Wednesday, December 23, 2015

SQL Agent job to run a task on a remote system that has no SQL Server instance

Problem

The last step of a SQL Server Agent job has to run a Windows task on a remote system that has no SQL Server instance installed.

Description

A SQL Server instance has to execute a number of database tasks by means of a SQL Server Agent job. This job is scheduled to run daily at a given time. The last step should execute a task on a remote Windows server. This remote server does not have a SQL Server instance installed. A simple solution would be to schedule a Windows task on the remote server, but we can never be certain that the SQL Agent job finishes before the remote server starts the task, unless we schedule the Windows task well after the estimated end of the SQL job. We prefer the whole process to finish in as little time as necessary.
Therefore, the last step of the job should trigger the execution of the task on the remote system. Maintenance on the remote system is kept simpler because there is no Windows schedule to maintain that has to be aligned with the SQL Agent schedule. The time to finish the complete job is also kept as short as possible, compared to keeping a large enough gap between the SQL schedule and the Windows task schedule.

Solution

The executable on the remote system can be run with a Windows user account that is already used as a Windows service account on the remote system. This account has a password that does not expire and it has sufficient rights to run the executable.
PowerShell is available on both servers and can be used to invoke remote commands. PowerShell can also be scripted as a job step in SQL Server Agent.
There is a PowerShell script on the remote computer that runs the executable.
What I need to do is add a job step that invokes the PowerShell script on the remote computer with the credentials of the Windows service account.

Remote script setup

The script that contains the command to run the executable on the remote system is saved as C:\Scripts\Powershell\RunSomeExecutable.ps1.
As proof of concept the script contains some lines to create a file in another directory of the remote server.
$text = 'Hello World'
$text | Out-File 'D:\Data\Test\file.txt'

Password encryption

The PowerShell command executed by the job step could contain the password hardcoded like this:
# Single quotes keep PowerShell from expanding the $ characters inside the password
$pw = ConvertTo-SecureString -AsPlainText -Force -String 'tH1s1sAPa$$w0rd'
$user = "domainname\username"
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$pw
If you do not want the password in plain text in the job step, you can use an encrypted password string instead: encrypt it once with the script below and paste the resulting string into the PowerShell command of the job step. Note that ConvertFrom-SecureString (without a key) uses Windows DPAPI, so the encrypted string is tied to the server (and user profile) that created it. Run the script below locally on the SQL Server that executes the job.
# Enter the Credentials details 
# DO THIS ON EVERY SQL SERVER locally logged in!!! The encrypted string will be different for every server

$password = read-host -prompt "Enter your Password"  
write-host "$password is password"  
$secure = ConvertTo-SecureString $password -force -asPlainText  
ConvertFrom-SecureString $secure |Out-File C:\Users\username\Documents\Powershell\Encryptpw.txt 
You can use the Windows Powershell ISE to create the code and run it.

Open Encryptpw.txt; it will contain a single line like this:
01000000d08c8ddf0115d1118c7a00c04fc297eb01000000f4dc057c2362784b850a195b175f2e520000000002000000000003660000c000000010000000c5b288e86f6e31e4c8245f1ebbf12f070000000004800000a00000001000000084793ddf17bc3c116f3d991e469f188718000000405af26d131debb66922c4381a81edef6b996f030ac5165b14000000fa6a327ea9da87e868b491c18b8393e89002d713
Copy the encrypted contents to the clipboard, then paste them into the first statement of the job step command to assign the string to the $encryptpw variable (see the next setup step).

Job step setup with encrypted password

Step Name:
Type: Powershell
Run As: SQL Server Agent Service Account
Command:
$encryptpw = "01000000d08c8ddf0115d1118c7a00c04fc297eb01000000f4dc057c2362784b850a195b175f2e520000000002000000000003660000c000000010000000c5b288e86f6e31e4c8245f1ebbf12f070000000004800000a00000001000000084793ddf17bc3c116f3d991e469f188718000000405af26d131debb66922c4381a81edef6b996f030ac5165b14000000fa6a327ea9da87e868b491c18b8393e89002d713"  
# Rebuild the secure string from the DPAPI-encrypted password string above
$pw = ConvertTo-SecureString -String $encryptpw
$user = "domainname\username"
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$pw
# Open a remote session on the target server and run the script there
$computer = "remotecomputername"
$session = New-PSSession -ComputerName $computer -Credential $cred
Invoke-Command -Session $session -ScriptBlock {Invoke-Expression "C:\Scripts\Powershell\RunSomeExecutable.ps1" }
Save the job step and the job, then run it.
Check on the remote server whether the task was executed. In my example: whether the file has been created.

SQLAgent proxy

You should test the job in a scheduled run as well, because the job step runs under the SQL Server Agent service account. If this account does not have sufficient rights to run the PowerShell script remotely, you will need to create a SQL Agent proxy with an account that does have those rights and let the job step run as that proxy.
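A minimal T-SQL sketch of such a proxy setup is shown below. The credential name, proxy name and account are hypothetical placeholders; the credential stores the Windows account and its password, and the proxy exposes that credential to the PowerShell job step subsystem.

-- Sketch: create a credential and a SQL Agent proxy for the PowerShell subsystem
-- (names and account are placeholders, adapt to your environment)
USE [master]
GO
CREATE CREDENTIAL [RemoteTaskCredential]
    WITH IDENTITY = N'domainname\username', SECRET = N'ThePasswordOfTheAccount';
GO
USE [msdb]
GO
EXEC dbo.sp_add_proxy @proxy_name = N'RemoteTaskProxy',
    @credential_name = N'RemoteTaskCredential', @enabled = 1;
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = N'RemoteTaskProxy',
    @subsystem_name = N'PowerShell';
GO

After this, the proxy can be selected in the 'Run as' dropdown of the PowerShell job step.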

Friday, December 18, 2015

Do not forget to enable Database Mail in SQL Server Agent too

After setting up Database Mail on a SQL Server, do not forget to also enable the mail profile in SQL Server Agent (Agent properties, Alert System page); otherwise job notifications and alerts will not be sent.
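A hedged T-SQL sketch of the same setting, writing the instance registry values that SQL Server Agent reads (the profile name 'DBA Mail Profile' is a placeholder, and the SQL Server Agent service has to be restarted afterwards):

USE [msdb]
GO
-- Enable Database Mail for SQL Server Agent and point it to a mail profile
EXEC master.dbo.xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\SQLServerAgent', N'UseDatabaseMail', N'REG_DWORD', 1
EXEC master.dbo.xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\SQLServerAgent', N'DatabaseMailProfile', N'REG_SZ', N'DBA Mail Profile'
GO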

Monday, November 30, 2015

SQL Server: reduce the size of an overdimensioned transaction log file

use master
go

-- Check why the log cannot be truncated (e.g. LOG_BACKUP, ACTIVE_TRANSACTION, NOTHING)
SELECT name, log_reuse_wait_desc FROM sys.databases

USE Mydatabase
go

CHECKPOINT

-- Shrink the log file of this database to a target size in MB
DBCC SHRINKFILE (N'MESTBIDW_DEV_log' , 48000)
GO

DBCC SQLPERF(LOGSPACE) --Optional -- just to see how much log space is used
DBCC LOGINFO --Optional -- shows the remaining virtual log files
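If log_reuse_wait_desc reports LOG_BACKUP for a database in the full recovery model, the log usually cannot be shrunk much until it has been backed up. A minimal sketch, assuming a hypothetical backup path:

-- Assumption: Mydatabase uses the full recovery model and LOG_BACKUP is blocking log reuse
BACKUP LOG Mydatabase TO DISK = N'D:\Backup\Mydatabase_log.trn'
GO
-- then run CHECKPOINT and DBCC SHRINKFILE again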

Thursday, May 28, 2015

How to install the Notepad++ language file for the PowerQuery formula language

The Power Query product is explained here: Introduction to Microsoft Power Query for Excel
As of the moment of this writing, the editor for the “M” language does not offer syntax coloring or IntelliSense. One way to aid in writing formulas is provided by Matt Mason, who made a Notepad++ language file for M.
You can download the language file from his webpage Notepad++ language file for the Power Query formula language (“M”)
The downloaded file is called “mlang_npp.xml”.
Next we need to install the language file in Notepad++
  • Open Notepad++
  • From the menu bar select Language > Define your language…
  • In the ‘user defined language’ dialog window, click on the ‘Import…’ button
    In the ‘Open file’ dialog window, browse to your download folder, select “mlang_npp.xml” and click ‘Open’.
  • The file should load and you see a success dialog box, click OK.
  • Close the ‘user defined language’ dialog window.
  • Close Notepad++ and reopen.
  • Open the Language menu, at the bottom you will see the “M” language choice, below “Define your language…” (not in the M submenu).
  • With the “M” language selected, the code in your editor has syntax highlighting and syntax folding.
Most likely Microsoft will add syntax highlighting some time in the future. For now the Notepad++ solution is helpful when building complex “M” statements.

Monday, May 04, 2015

Migrating my Postman collections from one MS-Windows system to another

Right now I’m in the process of migrating developer stuff from my old MS-Windows pc to a new one. On the old machine I’ve used the Postman tool often to test and deploy new web-services. After having installed the tool on the new pc I wanted to copy my collections from the old pc. This article describes how to accomplish this.

  • Start Postman on the old system
  • My collections
    image
  • In the Menu Bar click on the Settings icon
    image
  • In the Settings dialog window, click on the Data tab
    image
  • Click on the Download button to download your collections in a dump file. The ‘Save As’ dialog window opens. Select a name and location on a file share that can be reached by the new PC.
    image
  • Save the file.
  • Close the Postman tool on the old PC.
  • Start the Postman tool on the new PC.
  • In the Menu Bar click on the Settings icon.
  • In the Settings dialog window, click on the Data tab.
  • This time click on the ‘Select file’ button in the ‘Import Data’ section.
  • In the ‘Open file’ dialog select the postman dump file you created in the file share.
  • Click Open and that’s it. My collections are now available on the new PC.

Wednesday, April 22, 2015

Using Translations with Analysis Services and MS-Excel 2013

Analysis Services Translations

Analysis Services (SSAS) provides an easy translation mechanism for a richer end-user experience. SSAS translations allow for the translation of the captions of dimensions and facts, as well as the translation of the content of dimension name columns. To see the cube in his preferred language, the end user has to use a tool that makes use of SSAS translations; MS-Excel 2013 is such a tool. Provided the cube has a translation for the display language installed in MS-Excel, the cube and dimensions will automatically be shown in that display language. And provided the necessary Office language packs are installed, end users can change the display language and create MS-Excel reports in the language they need, as long as the SSAS cube provides a translation for that display language.

Setup dimension tables with different languages in SQL-Server

Every dimension table that has name columns to display the dimension members should have an additional name column for every language the end users request. There is always a default name column.

For instance: a dimension table called DimBlocking

CREATE TABLE [dbo].[DimBlocking](
[BlockingId] [int] NOT NULL,
[BlockingName] [nvarchar](50) NOT NULL,
CONSTRAINT [PK_DimBlocking] PRIMARY KEY CLUSTERED
(
[BlockingId] ASC
) ON [PRIMARY]
) ON [PRIMARY]
 

This dimension has a key column ‘BlockingId’ and a name column ‘BlockingName’. This name column will serve as the default name column in the SSAS dimension specification.

This table example has 2 rows of data:

BlockingId   BlockingName
0            Non-blocking
1            Blocking


To provide for other languages like Dutch and French, just add name columns of the same type and size as the default name column, e.g. BlockingNameNL for Dutch and BlockingNameFR for French.



CREATE TABLE [dbo].[DimBlocking](
    [BlockingId] [int] NOT NULL,
    [BlockingName] [nvarchar](50) NOT NULL,
    [BlockingNameNL] [nvarchar](50) NOT NULL,
    [BlockingNameFR] [nvarchar](50) NOT NULL,
    CONSTRAINT [PK_DimBlocking] PRIMARY KEY CLUSTERED ([BlockingId] ASC) ON [PRIMARY]
) ON [PRIMARY]

After adding the translation columns:

BlockingId   BlockingName   BlockingNameNL    BlockingNameFR
0            Non-blocking   Niet-blokkerend   Non-bloquant
1            Blocking       Blokkerend        Bloquant
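For an existing dimension table, the extra name columns can also be added and filled with a script along these lines (a sketch; the columns are added as NULL first, and the translated values come from whatever source you maintain them in):

-- Sketch: add translation columns to an existing dimension table and fill them
ALTER TABLE [dbo].[DimBlocking]
    ADD [BlockingNameNL] [nvarchar](50) NULL,
        [BlockingNameFR] [nvarchar](50) NULL;
GO
UPDATE [dbo].[DimBlocking] SET BlockingNameNL = N'Niet-blokkerend', BlockingNameFR = N'Non-bloquant' WHERE BlockingId = 0;
UPDATE [dbo].[DimBlocking] SET BlockingNameNL = N'Blokkerend', BlockingNameFR = N'Bloquant' WHERE BlockingId = 1;
GO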


Setup MS-Excel to use the different languages provided by a cube in SSAS


Office Language preferences


By default MS-Excel is installed with one Display language. This can be verified in the Options dialog window of MS-Excel, under the Language tab. In the screenshot below, Office 2013 was installed with English Display language.

image

Install additional languages from an Office language pack as needed. There are a few considerations when making translations for a country like Belgium, where ‘Dutch (Belgium)’ is installed as the Windows language but the Office 2013 language pack only offers Dutch from The Netherlands: ‘Nederlands (Nederland)’. Take care to set up the SSAS translations to match the name of the Display language and not of the Editing language.

Setup dimension translations for different languages in SSAS


Data Source View


Make sure that the Data Source View is refreshed to read the metadata for the new name columns. If your dimension table is based on a query, then adapt the query to read the new name columns. The table DimBlocking has been given the Friendly name ‘Blocking’.

image

Dimension designer - Structure


The dimension has a key and a name column. Most often a simple dimension with a key and a name has one Attribute, in this example called ‘Blocking’ with KeyColumn = Blocking.BlockingId and NameColumn = Blocking.BlockingName.

image

Dimension Designer – Translations


In the Translations design, we can add new translations for the captions, for the attributes, and also for the [All] and [Unknown] members.

Preparation to translate [All] and [Unknown] members

The captions of these 2 members appear by default when browsing a dimension:

image

To make [All] and [Unknown] translatable we can explicitly specify them in the Dimension properties.

In the Dimension Structure design, select the Dimension in the Attributes Pane, and look for the following 2 Dimension properties:


  • AttributeAllMemberName: set this to ‘All’ to enable translation. (Or use another word, e.g. Everything)
    image

  • UnknownMemberName: set this to ‘Unknown’ to enable translation (Or use any other word, like ‘Unidentified’)

Add Translations

In the Dimension Designer go to the Translations tab.

The Dimension properties that have translatable captions:


  • Caption

  • AttributeAllMemberName

  • UnknownMemberName.

And for the Attributes:


  • Caption

image


Click on the ‘Add new translation’ button

image

The Select language dialog box opens:

image

Select the language needed and make sure you have an exact match with the display language installed in MS-Office. E.g. choose ‘Dutch (Netherlands)’ when the Office 2013 language pack for ‘Nederlands (Nederland)’ is installed. Because I have a Dutch Office 2013 installation, the display language is shown translated by MS-Office.

For the Dimension Caption and Member names, type text in the Translation column.

image

For the Attribute translation, there is a translation for the caption, but here you can select a column name to use as translation for the attribute data. Click inside the translation column and you see a button. Click on the button to open the Attribute Data Translation dialog window. Enter the Translated caption and select the Translation column from the dimension table.

image

Save and deploy this dimension, make sure it is processed.

Connect MS-Excel to the SSAS cube


Use the default display language


The Dutch version of MS-Excel is setup to use ‘Nederlands’ as display language:

image

In the spreadsheet, connect to the Analysis Services cube and create a new pivot table. The pivot table field list shows the Blocking dimension with its translation.

image

After selection of this dimension in the Rows section, the pivot table shows the Attribute data translated in Dutch.

image

Excel row 2, 3 and 4 show the translated members of the ‘Blocking’ attribute.

MS-Excel automatically translates ‘Row-labels’ and ‘Grand total’ as ‘Rijlabels’ and ‘Eindtotaal’. This is not a function of the SSAS translation.

Change to another display language


  • Change the Display language to French.

  • Close and restart MS-Excel.

  • Open the same spreadsheet.

  • In the Data section, click on the Refresh button.

  • The pivot table is now shown with French captions and values.

image

Monday, April 20, 2015

SSRS Configure the Unattended Execution Account

One way to run scheduled reports is to use SQL Server logins and store the credentials in the data source of the report. For every SQL Server that needs to be accessed to read the report data, the SQL account credentials must be maintained, and the same credentials must be saved in every data source on the report server.

An alternative is to run SQL reports with an AD Windows account. The AD account has to be granted the same rights to read the report data, but without saving credentials in every data source: the Reporting Services server stores the credentials once, in the Unattended Execution Account of SSRS.

Configure the Unattended Execution Account

Start the Reporting Services Configuration manager

image

Connect to the Reporting services Server

image

In the left pane of the configuration manager, select ‘Execution Account’. Check ‘Specify an account’ and enter the AD account credentials.

image

Click on ‘Apply’. A dialog box will open asking for the name of a key backup file with its own password.

Give the unattended account read rights in the SQL Server database

Add the AD account as a Windows login to the SQL Server. Add the login as a user to the database. Grant the database user sufficient select and/or execute rights in the database to read the report data, as sketched below.
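A minimal T-SQL sketch, assuming a hypothetical account domainname\ssrsexec and a report database called ReportDB:

-- Sketch: grant the unattended execution account read access to the report data
USE [master]
GO
CREATE LOGIN [domainname\ssrsexec] FROM WINDOWS;
GO
USE [ReportDB]
GO
CREATE USER [domainname\ssrsexec] FOR LOGIN [domainname\ssrsexec];
ALTER ROLE db_datareader ADD MEMBER [domainname\ssrsexec];
-- or grant SELECT / EXECUTE on the specific objects the reports use
GO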

Use the unattended account in the Report Data Sources

The Connection string must use ‘Integrated Security=SSPI’
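For example, a data source connection string could look like this (server and database names are placeholders):

Data Source=MySqlServer;Initial Catalog=ReportDB;Integrated Security=SSPI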

For shared data sources on a report server: browse to the Data Sources folder of the report server.

Select a Data Source and click on it, or open the dropdown and select ‘Manage’.

In the Properties tab, add ‘;Integrated Security=SSPI’ to the connection string textbox and select ‘Credentials are not required’.

image

Click ‘Apply’.

Conclusion

Account administration is managed by the AD admins. The SQL admin only needs to define the AD account as a login on the SQL Server level and grant access rights on the database level. There is no need to maintain passwords for every SQL Server.

The Report server stores the unattended account once, hence it is no longer necessary to maintain SQL Login credentials and passwords for every Data Source deployed on the report server.

Thursday, April 16, 2015

Query to find specific job steps in SQL Server Agent jobs

To compare similar job steps on a server that ran several of Ola Hallengren's maintenance scripts, I came across this example to Get all job steps in SQL Server by Sufian Rashid.

I changed the WHERE condition to look for job names ending in ‘FULL’ that execute a ‘sqlcmd’ statement:

SELECT JOB.NAME, STEP.STEP_ID, STEP.STEP_NAME, STEP.COMMAND
FROM Msdb.dbo.SysJobs JOB
    INNER JOIN Msdb.dbo.SysJobSteps STEP ON STEP.Job_Id = JOB.Job_Id
WHERE JOB.Enabled = 1
    AND (JOB.Name LIKE '%FULL' and STEP.COMMAND LIKE 'sqlcmd%')
ORDER BY JOB.NAME, STEP.STEP_ID

The query result:


image

This list allows me to easily compare the commands executed.

E.g. I saw that on line 3 there was a parameter with value 47, whereas on line 4 I had the same parameter with value 48.

The LogToTable parameter for the instance SQL1\STORE has value ‘N’, whereas this parameter has value ‘Y’ for the SQL1\PRD instance.

Tuesday, April 07, 2015

Deploying SSAS databases

Deploy AS database to test environment with VS 2012

With Visual Studio 2012 and SQL Server Data Tools 11.1.50318.0

Project configuration Active Solution is set to Development.

From within Visual Studio, right-click on the SSAS project, Select ‘Deploy’ in the dropdown menu.

Deploy AS database to production with Deployment Wizard

Start Analysis Services deployment wizard

image

Welcome screen – Next

image

Database file

Browse to the bin folder of your VS2012 Analysis services project and look for the file with the asdatabase extension. Next

image

Installation target. Enter Server name and Database name as it should appear on that server. Next

image

Specify Options for partitions and roles. Next

image

Specify configuration Properties. Check ‘Retain configuration settings for existing objects’.

image

Select Processing Options. Default processing. Next.

image

Confirm deployment. Next.

Deployment is done.

Sunday, April 05, 2015

Microsoft Hands-On Lab Implementing and Managing AlwaysOn Availability groups

When you have not yet had the opportunity to experience the SQL 2012 AlwaysOn availability feature, on Technet there is a Hands-On Lab that gives some insight:

Implementing and Managing AlwaysOn Availability groups

The exercises in the lab not only shed some light on how a basic AlwaysOn configuration can be set up by the DBA, they also show two examples of how to set up the connection string for a CRUD application and for a reporting application, as sketched below.
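A hedged sketch of the idea (the listener and database names are placeholders): the read/write (CRUD) application connects to the availability group listener as usual, while the reporting application adds ApplicationIntent=ReadOnly so it can be routed to a readable secondary replica.

CRUD application:
Server=AGListener01;Database=SalesDB;Integrated Security=SSPI

Reporting application (routed to a readable secondary):
Server=AGListener01;Database=SalesDB;Integrated Security=SSPI;ApplicationIntent=ReadOnly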

Thursday, March 26, 2015

System backup software triggers unwanted SQL backups

Problem

Sometimes the backup software that is used to back up the Windows servers on which SQL Server is running will trigger a SQL backup automatically. You may have set up a neat SQL backup schedule in SQL Server Agent with full, differential and transaction log backups, but whenever this server backup software kicks in and triggers another full backup of your databases, the backup chain you created is broken by that unwanted backup in the background, and you risk restoring the wrong files if you ever need to restore from your full, differential and transaction log backups.

You do not see the unwanted backups in the SQL Server Agent log, but you will find them in the standard SQL Server error logs. The SQL Server log typically shows lines like this:

  • Database backed up. Database: master, creation date(time): 2015/02/19(16:52:39), pages dumped: 1356, first LSN: 388:415:117, last LSN: 388:465:1, number of dump devices: 1, device information: (FILE=1, TYPE=VIRTUAL_DEVICE: {'{9603DF6C-C7E9-4A19-BAAF-C827EFE22023}7'}).
  • I/O was resumed on database master. No user action is required.
  • I/O is frozen on database master. No user action is required. However, if I/O is not resumed promptly, you could cancel the backup.

The same messages are repeated for all the other system and user databases on that particular server.
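These backups also show up in the msdb backup history. A hedged query to spot them, assuming the VSS-triggered backups are marked as snapshot backups taken on a virtual device:

-- Recent backups; is_snapshot = 1 indicates a snapshot (VSS) backup
SELECT bs.database_name, bs.backup_start_date, bs.type, bs.is_snapshot, mf.physical_device_name
FROM msdb.dbo.backupset bs
    INNER JOIN msdb.dbo.backupmediafamily mf ON mf.media_set_id = bs.media_set_id
ORDER BY bs.backup_start_date DESC;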

Cause

The SQL server service called ‘SQL Server VSS Writer’ is used by the Windows backup software to trigger the SQL server backup. From SQL BOL:

When running, Database Engine locks and has exclusive access to the data files. When the SQL Writer Service is not running, backup programs running in Windows do not have access to the data files, and backups must be performed using SQL Server backup.

  • ‘Use the SQL Writer Service to permit Windows backup programs to copy SQL Server data files while SQL Server is running.’

Solution

In the Services management console: Stop and disable the ‘SQL Server VSS Writer’ service.

image

Working with environments on a deployed SSIS 2012 Project

When an SSIS project is deployed from SQL Server Data Tools to the Integration Services catalog, the project and package connections and parameters are stored together with the project and the packages. Each parameter or connection property in the catalog can be reconfigured by means of environments. With environments properly set up, developers do not need to worry about changing connections or parameters when they release an upgrade to the production server, because the production environment will automatically replace the test values in the connections and parameters with the values set up for production.

Catalog folder after 1st time deployment

An example of a project in the catalog with 1 package. No Environments have been created yet.

image

Create Environments

In this example we will create 2 new environments with a sql script: Test and Production.

---- create environment
EXEC [SSISDB].[catalog].[create_environment] @environment_name=N'Production', @environment_description=N'', @folder_name=N'SSIS_2012_Test'
GO
EXEC [SSISDB].[catalog].[create_environment] @environment_name=N'Test', @environment_description=N'', @folder_name=N'SSIS_2012_Test'
GO

After the script is executed: the 2 new environments are visible in the Environments folder.


image


Create environment variables


For project parameters



There is a project parameter in the SSIS project called ‘DaysToLoad’ with value 15. This is shown in the project configuration: right-click on the project and select ‘Configure…’


image


Or with a T-SQL script, create an environment variable ‘prjDaysToLoad’ in both environments:



DECLARE @var int = N'30'
EXEC [SSISDB].[catalog].[create_environment_variable] @variable_name=N'prjDaysToLoad', @sensitive=False, @description=N''
    , @environment_name=N'Production', @folder_name=N'SSIS_2012_Test', @value=@var, @data_type=N'Int32'
EXEC [SSISDB].[catalog].[create_environment_variable] @variable_name=N'prjDaysToLoad', @sensitive=False, @description=N''
    , @environment_name=N'Test', @folder_name=N'SSIS_2012_Test', @value=@var, @data_type=N'Int32'
GO

Add a reference to the environments in the project


The project needs to be made aware of the available environments first.


You can do this by right-clicking on the project name, selecting ‘Configure…’ and adding the reference in the References panel. Or you can do it with a T-SQL script:



Declare @reference_id bigint
EXEC [SSISDB].[catalog].[create_environment_reference] @environment_name=N'Production', @reference_id=@reference_id OUTPUT, @project_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @reference_type=R
EXEC [SSISDB].[catalog].[create_environment_reference] @environment_name=N'Test', @reference_id=@reference_id OUTPUT, @project_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @reference_type=R
GO

The result in the configure dialog window looks like this


image



The environment variable created earlier can be seen in the properties of the ‘Production’ environment:


image



Reference the environment variable from a parameter in the Project


image


or do it with a T-SQL script



---- reference the environment from the project parameter
EXEC [SSISDB].[catalog].[set_object_parameter_value] @object_type=20, @parameter_name=N'DaysToLoad', @object_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @project_name=N'SSIS_2012_TestOdata3', @value_type=R, @parameter_value=N'prjDaysToLoad'
GO


This is the result


image


Be careful: The name of the environment variable needs to be identical in both environments


Run the package manually from the Catalog with a reference to the environment


When a package in this project is run, we add a reference to one of the environments, ‘Production’ or ‘Test’.


Right-click on a package and select ‘Execute…’. The Execute Package dialog will warn that no environment is selected for the parameter that has a reference:


image


Check the Environment checkbox and select one of the environments


image


When OK is clicked, the package will use the parameter value that comes from the selected environment. The same manual run can also be started with T-SQL, as sketched below.
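A hedged T-SQL sketch of such a manual run, using the SSIS catalog stored procedures. The folder, project and environment names are the ones from this example, the package name is a placeholder, and the reference_id is looked up from the environment reference created earlier:

-- Sketch: start an execution that uses the 'Test' environment reference
DECLARE @reference_id bigint, @execution_id bigint;
SELECT @reference_id = er.reference_id
FROM [SSISDB].[catalog].[environment_references] er
    INNER JOIN [SSISDB].[catalog].[projects] p ON p.project_id = er.project_id
WHERE p.name = N'SSIS_2012_TestOdata3' AND er.environment_name = N'Test';

EXEC [SSISDB].[catalog].[create_execution]
    @package_name = N'Package.dtsx',       -- placeholder package name
    @folder_name = N'SSIS_2012_Test',
    @project_name = N'SSIS_2012_TestOdata3',
    @reference_id = @reference_id,
    @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;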


Run the package in a SQL Server Agent job with a reference to the environment


In a job step the procedure is similar: select the package, check the environment checkbox and select one of the environments.


image


Conclusion


Once SSIS projects are deployed from a development environment to a production environment, you do not need to change parameter or connection settings in your Visual Studio project before the deployment. In Integration Services you can set up one or more environments with variables for the parameters and connections that differ from the development environment. The developer has no worries about deploying to the production environment (either directly or through a deployment script), because the DBA can set up an environment in the catalog so that the packages connect to the right databases and use parameters adapted for that environment.


In a production environment I tend to create only the ‘Production’ environment, and on a test or QA server I create only the ‘Test’ environment. But you can still use several different environments in one production environment, for instance when the packages need to address ETL operations on several production servers.

Schedule to recycle the SQL Server Logs

A new SQL Server log file is created each time the SQL Server service is restarted. On a production server that is rarely restarted, this can result in enormous log files.

Below is a script to schedule a weekly recycle of the SQL Server logs. The number of log files to keep is set to 12 before they are recycled, giving roughly a three-month history of the logs.

Other combinations are possible, e.g. make a daily schedule and set the number of files to 30; this way the log files are smaller and the history is one month long.

---- set SQL server log recycle property to Recycle after 12 files
USE [master]
GO
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'NumErrorLogs', REG_DWORD, 12
GO
---- Create SQL Server Agent Job to do a weekly recycle (Saturday 12:00) of the SQL Server log
USE [msdb]
GO
DECLARE @jobId BINARY(16)
EXEC  msdb.dbo.sp_add_job @job_name=N'MSOL Recycle SQL Server Logs', 
        @enabled=1, 
        @notify_level_eventlog=0, 
        @notify_level_email=2, 
        @notify_level_netsend=2, 
        @notify_level_page=2, 
        @delete_level=0, 
        @category_name=N'Database Maintenance', 
        @owner_login_name=N'sa', 
        @notify_email_operator_name=N'ICT Helpdesk', @job_id = @jobId OUTPUT
select @jobId
GO
EXEC msdb.dbo.sp_add_jobserver @job_name=N'MSOL Recycle SQL Server Logs', @server_name = N'VMXYZ'
GO
USE [msdb]
GO
EXEC msdb.dbo.sp_add_jobstep @job_name=N'MSOL Recycle SQL Server Logs', @step_name=N'RecycleErrorLog', 
        @step_id=1, 
        @cmdexec_success_code=0, 
        @on_success_action=1, 
        @on_fail_action=2, 
        @retry_attempts=0, 
        @retry_interval=0, 
        @os_run_priority=0, @subsystem=N'TSQL', 
        @command=N'EXEC sp_cycle_errorlog', 
        @database_name=N'master', 
        @flags=0
GO
USE [msdb]
GO
EXEC msdb.dbo.sp_update_job @job_name=N'MSOL Recycle SQL Server Logs', 
        @enabled=1, 
        @start_step_id=1, 
        @notify_level_eventlog=0, 
        @notify_level_email=2, 
        @notify_level_netsend=2, 
        @notify_level_page=2, 
        @delete_level=0, 
        @description=N'', 
        @category_name=N'Database Maintenance', 
        @owner_login_name=N'sa', 
        @notify_email_operator_name=N'ICT Helpdesk', 
        @notify_netsend_operator_name=N'', 
        @notify_page_operator_name=N''
GO
USE [msdb]
GO
DECLARE @schedule_id int
EXEC msdb.dbo.sp_add_jobschedule @job_name=N'MSOL Recycle SQL Server Logs', @name=N'Weekly Saturday 12:00', 
        @enabled=1, 
        @freq_type=8, 
        @freq_interval=64, 
        @freq_subday_type=1, 
        @freq_subday_interval=0, 
        @freq_relative_interval=0, 
        @freq_recurrence_factor=1, 
        @active_start_date=20150326, 
        @active_end_date=99991231, 
        @active_start_time=120000, 
        @active_end_time=235959, @schedule_id = @schedule_id OUTPUT
select @schedule_id
GO



Wednesday, March 25, 2015

SSIS 2012 package with Odata source–execution failed (part 3)

In the post SSIS 2012 package with Odata source-failure (part 2) I was able to deploy my package with the OData source, but the package failed to run.

Change from Project deployment to Package Deployment

I converted my project with package connections from project deployment to package deployment. Then I deployed the package to the MSDB package store of Integration Services on the server. That is: I connected with SSMS to Integration Services as an administrator, added a new folder under the MSDB folder and imported the dtsx into the MSDB package store. The import worked. Still from within SSMS and connected to Integration Services, I right-clicked on the package in the MSDB store and selected 'Run Package'. In the Execute Package dialog window I clicked 'Execute' and the package actually ran: the OData source was opened, the data was read and imported into my database.

This experience made me realize what the real culprit was: the OData connector is a 32-bit connector in SQL Server 2012.

Odata source is 32-bit

To prove this: with SSMS I connected to the SQL database server and, in the Integration Services catalog, I right-clicked on the package that was deployed in project deployment mode and selected 'Execute...'. In the execute dialog window, on the 'Advanced' tab, I checked '32-bit runtime'. And the package ran.

Final conclusion

  1. Project deployment will work, but not with the OData source as a project connection. The OData connection must be defined in each package as a package connection.
  2. Run the deployed packages in 32-bit mode on the server (see the sketch below).
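For a run started from T-SQL, the 32-bit runtime can be requested on the execution; a hedged sketch (folder, project and package names are placeholders):

-- Sketch: run a package from the SSIS catalog in 32-bit mode
DECLARE @execution_id bigint;
EXEC [SSISDB].[catalog].[create_execution]
    @package_name = N'Package.dtsx',
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @use32bitruntime = 1,
    @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;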

Tuesday, March 24, 2015

SSIS 2012 package with Odata source-failure (part 2)

Deployment failure

In my previous post SSIS 2012 package with Odata source–failed to deploy project I encountered a deployment problem with an SSIS project created in project deployment mode with VS 2012 and deployed to SQL 2012.

In this package I have an OData connection to a SharePoint OData service. I put the connection under the project connection managers so that I could share the same connection amongst several packages in the project. The deployment failed on the system on which I was developing, as well as on another SQL 2012 server.

This is a project connection

image

Maybe the OData connection should not be in the project connection managers, but defined each time as a package connection. I decided to create a new SSIS project with one package, with the OData connection defined in the package.

This is a Package connection

image

Change OData from Project to Package connection

The project with the Package connections deploys without errors in project deployment mode.

image

The project and package can be seen in the Integration Services catalog:

image

Package deploys but Executes with runtime errors

When I execute the package, it fails with this error:

‘Error: The connection type "ODATA" specified for connection manager "OData Incidents" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.’

I shall have to investigate further.

SSIS 2012 package with Odata source–failed to deploy project

On a test server with VS 2012 and SQL 2012 installed, I created an SSIS project in project deployment mode. In the packages of the project I use the OData source for SSIS. The packages run fine in VS 2012, but when I deploy the project to the Integration Services catalog on the same server where I am running VS 2012, I get a deployment error:

‘Failed to deploy project. For more information, query the operation_messages view for the operation identifier '25084'.  (Microsoft SQL Server, Error: 27203)

image

To find out the cause of the problem, you can connect with SSMS to the SSISDB database where the SSIS projects are stored, and select the record with the operation_id from the operation_messages view, as suggested in the above-mentioned error message:

use SSISDB
go

select * from catalog.operation_messages where operation_id = 25084



The message column in the result contains the cause of the error:


‘Failed to deploy the project. Fix the problems and try again later.:Unable to create the type with the name 'ODATA'.’


In my case, the deployment of SSIS packages that use the OData source failed on the same server where I had tested them successfully with VS 2012.


Now I have to find out what is wrong; see SSIS 2012 package with Odata source-failure (part 2).