Showing posts with label SQL Server 2012. Show all posts

Wednesday, December 23, 2015

SQL Agent job to run a task on a remote system that has no SQL Server instance

Problem

The last step of a SQL Server Agent job has to run a Windows task on a remote system that has no SQL Server instance installed.

Description

A SQL Server instance has to execute a number of database tasks by means of a SQL Server Agent job that is scheduled to run daily at a given time. The last step should execute a task on a remote Windows server that does not have a SQL Server instance installed. A simple solution would be to schedule a Windows task on the remote server, but we can never be certain that the SQL Agent job finishes before the remote server starts the task, unless we schedule the Windows task well after the estimated end of the SQL job. We prefer the whole process to finish in as little time as necessary.
Therefore, the last step of the job should trigger the execution of the task on the remote system. Maintenance on the remote system is kept simpler because there is no Windows schedule to maintain that has to be aligned with the SQL Agent schedule. The time needed to finish the complete job is also kept as short as possible, compared to keeping a large enough gap between the SQL schedule and the Windows task schedule.

Solution

The executable on the remote system can be run with a Windows user account that is already used as a Windows service account on that system. This account has a password that does not expire and it has sufficient rights to run the executable.
PowerShell is available on both servers and can be used to invoke remote commands; it can also be scripted as a job step in SQL Server Agent.
A PowerShell script on the remote computer runs the executable.
What I need to do is add a job step that invokes the PowerShell script on the remote computer with the credentials of the Windows service account.
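One prerequisite worth checking first: PowerShell remoting must be enabled on the remote server before a session can be opened from the SQL Server. A quick sketch, using the example computer name from the job step further down (run the first command on the remote server, the second on the SQL Server):

```powershell
# On the remote server, in an elevated PowerShell prompt:
# starts the WinRM service and creates the listener used by remote sessions
Enable-PSRemoting -Force

# On the SQL Server: verify that the WinRM listener on the remote computer responds
Test-WSMan -ComputerName remotecomputername
```

If Test-WSMan returns the WSMan identity information instead of an error, the remote session in the job step should be able to connect.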

Remote script setup

The script that contains the command to run the executable on the remote system is saved as C:\Scripts\Powershell\RunSomeExecutable.ps1.
As a proof of concept, the script contains a few lines that create a file in another directory of the remote server.
# Proof of concept: write a small text file on the remote server
$text = 'Hello World'
$text | Out-File 'D:\Data\Test\file.txt'

Password encryption

The PowerShell command executed by the job step could contain the password hardcoded like this (note the single quotes, so that the $ characters in the password are not expanded):
$pw = ConvertTo-SecureString -AsPlainText -Force -String 'tH1s1sAPa$$w0rd'
$user = "domainname\username"
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$pw
If you do not want the password in clear text, encrypt it on the server that runs the job and then enter the encrypted string in the PowerShell command in the job step.
# Enter the credential details
# DO THIS LOCALLY LOGGED IN ON EVERY SQL SYSTEM, as the account that will run the job step!
# The encryption (Windows DPAPI) is tied to the user and the machine, so the string differs per server and per account

$password = read-host -prompt "Enter your Password"  
write-host "$password is password"  
$secure = ConvertTo-SecureString $password -force -asPlainText  
ConvertFrom-SecureString $secure |Out-File C:\Users\username\Documents\Powershell\Encryptpw.txt 
You can use the Windows PowerShell ISE to create and run this code.

Open Encryptpw.txt; it will contain a single encrypted line like this
01000000d08c8ddf0115d1118c7a00c04fc297eb01000000f4dc057c2362784b850a195b175f2e520000000002000000000003660000c000000010000000c5b288e86f6e31e4c8245f1ebbf12f070000000004800000a00000001000000084793ddf17bc3c116f3d991e469f188718000000405af26d131debb66922c4381a81edef6b996f030ac5165b14000000fa6a327ea9da87e868b491c18b8393e89002d713
Copy the encrypted string to the clipboard, then paste it into the first statement of the job step command, where it is assigned to the $encryptpw variable (see the next setup step)

Job step setup with encrypted password

Step Name:
Type: Powershell
Run As: SQL Server Agent Service Account
Command:
$encryptpw = "01000000d08c8ddf0115d1118c7a00c04fc297eb01000000f4dc057c2362784b850a195b175f2e520000000002000000000003660000c000000010000000c5b288e86f6e31e4c8245f1ebbf12f070000000004800000a00000001000000084793ddf17bc3c116f3d991e469f188718000000405af26d131debb66922c4381a81edef6b996f030ac5165b14000000fa6a327ea9da87e868b491c18b8393e89002d713"  
$pw = ConvertTo-SecureString -String $encryptpw
$user = "domainname\username"
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$pw
$computer = "remotecomputername"
$session = New-PSSession -ComputerName $computer -Credential $cred
Invoke-Command -Session $session -ScriptBlock { & 'C:\Scripts\Powershell\RunSomeExecutable.ps1' }
Remove-PSSession $session
Save the job step and the job, then run it.
Check on the remote server whether the task was executed. In my example: whether the file has been created.

SQLAgent proxy

You should test the job in a scheduled run as well, because the job step will then run under the SQL Server Agent service account. If this account does not have sufficient rights to run the PowerShell script remotely, you will need to create a SQL Agent proxy with an account that has those rights.
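A minimal sketch of such a proxy setup; the credential and proxy names are hypothetical, and the credential holds the Windows account that has the remoting rights:

```sql
-- Hypothetical names: PSRemoteCredential, PSRemotingProxy
USE [master];
GO
-- Credential that stores the Windows account and its password
CREATE CREDENTIAL [PSRemoteCredential]
    WITH IDENTITY = N'domainname\username', SECRET = N'password';
GO
USE [msdb];
GO
-- Proxy based on the credential, granted to the PowerShell subsystem
EXEC dbo.sp_add_proxy @proxy_name = N'PSRemotingProxy',
    @credential_name = N'PSRemoteCredential', @enabled = 1;
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = N'PSRemotingProxy',
    @subsystem_name = N'PowerShell';
GO
```

After this, the proxy can be selected in the 'Run as' dropdown of the PowerShell job step.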

Monday, November 30, 2015

SQL Server: reduce the size of an oversized transaction log file

use master
go

-- The log can only be shrunk when nothing is holding it in use; check the reason in log_reuse_wait_desc (should be NOTHING)
SELECT name, log_reuse_wait_desc FROM sys.databases

USE Mydatabase
go

CHECKPOINT

-- Shrink with the logical name of the log file of your database and a target size in MB
DBCC SHRINKFILE (N'MESTBIDW_DEV_log' , 48000)
GO

DBCC SQLPERF(LOGSPACE) --Optional -- just to see how much log space is in use
DBCC LOGINFO --Optional -- lists the virtual log files

Tuesday, April 07, 2015

Deploying SSAS databases

Deploy AS database to test environment with VS 2012

With Visual Studio 2012 and SQL Server Data Tools 11.1.50318.0

Project configuration Active Solution is set to Development.

From within Visual Studio, right-click on the SSAS project, Select ‘Deploy’ in the dropdown menu.

Deploy AS database to production with Deployment Wizard

Start Analysis Services deployment wizard

image

Welcome screen – Next

image

Database file

Browse to the bin folder of your VS2012 Analysis Services project and look for the file with the .asdatabase extension. Next

image

Installation target. Enter Server name and Database name as it should appear on that server. Next

image

Specify Options for partitions and roles. Next

image

Specify configuration Properties. Check ‘Retain configuration settings for existing objects’.

image

Select Processing Options. Default processing. Next.

image

Confirm deployment. Next.

Deployment is done.

Thursday, March 26, 2015

Working with environments on a deployed SSIS 2012 Project

When an SSIS project is deployed from SQL Server Data Tools to the Integration Services Catalog, the project and package connections and parameters are stored together with the project and the packages. Each parameter or connection property in the Catalog can be reconfigured by means of environments. With environments properly set up, developers do not need to worry about changing connections or parameters when they release an upgrade to the production server, because the production server environment will automatically replace the test values in the connections and parameters with the values set up in the production environment.

Catalog folder after 1st time deployment

An example of a project in the catalog with 1 package. No Environments have been created yet.

image

Create Environments

In this example we will create 2 new environments with a sql script: Test and Production.

---- create environment
EXEC [SSISDB].[catalog].[create_environment] @environment_name=N'Production', @environment_description=N'', @folder_name=N'SSIS_2012_Test'
GO
EXEC [SSISDB].[catalog].[create_environment] @environment_name=N'Test', @environment_description=N'', @folder_name=N'SSIS_2012_Test'
GO

After the script is executed: the 2 new environments are visible in the Environments folder.


image


Create environment variables


For project parameters



There is a project parameter 'DaysToLoad' in the SSIS project, with value 15. It is shown in the project configuration: right-click on the project and select ‘Configure…’


image


Or with a T-SQL script, create an environment variable ‘prjDaysToLoad’ in both environments:



DECLARE @var int = 30
EXEC [SSISDB].[catalog].[create_environment_variable] @variable_name=N'prjDaysToLoad', @sensitive=False, @description=N''
    , @environment_name=N'Production', @folder_name=N'SSIS_2012_Test', @value=@var, @data_type=N'Int32'
EXEC [SSISDB].[catalog].[create_environment_variable] @variable_name=N'prjDaysToLoad', @sensitive=False, @description=N''
    , @environment_name=N'Test', @folder_name=N'SSIS_2012_Test', @value=@var, @data_type=N'Int32'
GO

Add a reference to the environments in the project


The project needs to be made aware of the available environments first.


You can do this by right-clicking on the project name, selecting ‘Configure…’ and adding the reference in the References panel. Or you can do it with a T-SQL script



Declare @reference_id bigint
EXEC [SSISDB].[catalog].[create_environment_reference] @environment_name=N'Production', @reference_id=@reference_id OUTPUT, @project_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @reference_type=R
EXEC [SSISDB].[catalog].[create_environment_reference] @environment_name=N'Test', @reference_id=@reference_id OUTPUT, @project_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @reference_type=R
GO

The result in the configure dialog window looks like this


image



See the result in the ‘Production’ environment


image



Reference the environment variable from a parameter in the Project


image


or do it with a T-SQL script



---- reference the environment from the project parameter
EXEC [SSISDB].[catalog].[set_object_parameter_value] @object_type=20, @parameter_name=N'DaysToLoad', @object_name=N'SSIS_2012_TestOdata3', @folder_name=N'SSIS_2012_Test', @project_name=N'SSIS_2012_TestOdata3', @value_type=R, @parameter_value=N'prjDaysToLoad'
GO


This is the result


image


Be careful: The name of the environment variable needs to be identical in both environments


Run the package manually from the Catalog with a reference to the environment


When a package in this project is run, we add a reference to one of the environments, ‘Production’ or ‘Test’.


Right-click on a package and select Execute. The Execute Package dialog will warn that no environment has been selected for the parameter that has a reference:


image


Check the Environment checkbox and select one of the environments


image


When OK is clicked the package will use the value for the parameter that comes from the selected environment.
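Executing with an environment reference can also be scripted in T-SQL, much like what SSMS generates behind the dialog. A sketch, assuming the folder and project names from this example and a hypothetical package name 'Package.dtsx':

```sql
-- Look up the reference_id of the 'Production' environment for this project
DECLARE @ref_id BIGINT =
   (SELECT er.reference_id
      FROM [SSISDB].[catalog].[environment_references] er
      JOIN [SSISDB].[catalog].[projects] p ON p.project_id = er.project_id
     WHERE p.name = N'SSIS_2012_TestOdata3'
       AND er.environment_name = N'Production');

-- Create and start an execution that uses this environment reference
DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
      @folder_name  = N'SSIS_2012_Test'
    , @project_name = N'SSIS_2012_TestOdata3'
    , @package_name = N'Package.dtsx'   -- hypothetical package name
    , @reference_id = @ref_id
    , @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;
GO
```

The parameter values then come from the referenced environment, exactly as when the environment is selected in the Execute Package dialog.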


Run the package in a SQL Server Agent job with a reference to the environment


In a job step the procedure is similar, select the package and check the environment checkbox and select one of the environments.


image


Conclusion


Once SSIS projects are deployed from a development environment to a production environment, you do not need to change parameter or connection settings in your Visual Studio project before the deployment. In the Integration Services Catalog you can set up one or more environments with variables for parameters and connections that differ from the development environment. The developer does not have to worry about deploying to the production environment (either directly or through a deployment script), because the DBA can set up an environment in the Catalog so that the packages connect to databases and use parameters adapted for that environment.


In a production environment I tend to create only the ‘Production’ environment, and on a test or QA server I create only the ‘Test’ environment. But you can still use several different environments on a production server when the packages need to address ETL operations on several production servers.

Schedule to recycle the SQL Server Logs

A new SQL Server error log file is created each time the SQL Server service is restarted. On a production server that is seldom restarted, this can result in enormous log files.

Below is a script to schedule a weekly recycle of the SQL Server logs. The number of retained log files is set to 12, giving a three-month history of the logs.

Other combinations are possible, e.g. a daily schedule with the number of files set to 30: the log files are then shorter and the history covers one month.

---- set SQL server log recycle property to Recycle after 12 files
USE [master]
GO
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'NumErrorLogs', REG_DWORD, 12
GO
---- Create SQL Server Agent Job to do a weekly recycle (Saturday 12:00) of the SQL Server log
USE [msdb]
GO
DECLARE @jobId BINARY(16)
EXEC  msdb.dbo.sp_add_job @job_name=N'MSOL Recycle SQL Server Logs', 
        @enabled=1, 
        @notify_level_eventlog=0, 
        @notify_level_email=2, 
        @notify_level_netsend=2, 
        @notify_level_page=2, 
        @delete_level=0, 
        @category_name=N'Database Maintenance', 
        @owner_login_name=N'sa', 
        @notify_email_operator_name=N'ICT Helpdesk', @job_id = @jobId OUTPUT
select @jobId
GO
EXEC msdb.dbo.sp_add_jobserver @job_name=N'MSOL Recycle SQL Server Logs', @server_name = N'VMXYZ'
GO
USE [msdb]
GO
EXEC msdb.dbo.sp_add_jobstep @job_name=N'MSOL Recycle SQL Server Logs', @step_name=N'RecycleErrorLog', 
        @step_id=1, 
        @cmdexec_success_code=0, 
        @on_success_action=1, 
        @on_fail_action=2, 
        @retry_attempts=0, 
        @retry_interval=0, 
        @os_run_priority=0, @subsystem=N'TSQL', 
        @command=N'EXEC sp_cycle_errorlog', 
        @database_name=N'master', 
        @flags=0
GO
USE [msdb]
GO
EXEC msdb.dbo.sp_update_job @job_name=N'MSOL Recycle SQL Server Logs', 
        @enabled=1, 
        @start_step_id=1, 
        @notify_level_eventlog=0, 
        @notify_level_email=2, 
        @notify_level_netsend=2, 
        @notify_level_page=2, 
        @delete_level=0, 
        @description=N'', 
        @category_name=N'Database Maintenance', 
        @owner_login_name=N'sa', 
        @notify_email_operator_name=N'ICT Helpdesk', 
        @notify_netsend_operator_name=N'', 
        @notify_page_operator_name=N''
GO
USE [msdb]
GO
DECLARE @schedule_id int
EXEC msdb.dbo.sp_add_jobschedule @job_name=N'MSOL Recycle SQL Server Logs', @name=N'Weekly Saturday 12:00', 
        @enabled=1, 
        @freq_type=8, 
        @freq_interval=64, 
        @freq_subday_type=1, 
        @freq_subday_interval=0, 
        @freq_relative_interval=0, 
        @freq_recurrence_factor=1, 
        @active_start_date=20150326, 
        @active_end_date=99991231, 
        @active_start_time=120000, 
        @active_end_time=235959, @schedule_id = @schedule_id OUTPUT
select @schedule_id
GO



Wednesday, March 25, 2015

SSIS 2012 package with Odata source–execution failed (part 3)

In the post SSIS 2012 package with Odata source-failure (part 2) I was able to deploy my package with an OData source, but the package failed to run.

Change from Project deployment to Package Deployment

I converted my project with package connections from project deployment to package deployment. Then I deployed the package on the server in the MSDB catalog: I connected with SSMS to Integration Services as an administrator, added a new folder for my package under the MSDB folder and imported the dtsx into the MSDB package store. The import worked. Still within SSMS, connected to Integration Services, I right-clicked on the package in the MSDB store and selected 'Run package'. In the Execute Package dialog window I clicked 'Execute' and the package actually ran: the OData connection was opened, the data was read and imported into my database.

This experience made me realize what the real culprit was: the OData connector is a 32-bit connector in SQL Server 2012.

Odata source is 32-bit

To prove this: with SSMS I connected to the SQL database server, and in the Integration Services catalog I right-clicked on the package that was deployed in project deployment mode and selected 'Execute...'. In the execute dialog window, on the 'Advanced' tab, I checked '32-bit runtime'. And the package ran.

Final conclusion

  1. Project deployment will work, but not with the OData source as a project connection. The OData connection must be defined in each package as a package connection.
  2. Run the deployed packages in 32-bit mode on the server.
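When the package is started from T-SQL instead of the SSMS dialog, the 32-bit runtime can be requested with the @use32bitruntime parameter of create_execution. A sketch; the folder, project and package names below are hypothetical placeholders:

```sql
-- Hypothetical folder, project and package names
DECLARE @execution_id BIGINT;
EXEC [SSISDB].[catalog].[create_execution]
      @folder_name  = N'SSIS_2012_Test'
    , @project_name = N'SSIS_2012_TestOdata3'
    , @package_name = N'Package.dtsx'
    , @use32bitruntime = 1      -- run with the 32-bit runtime so the OData source loads
    , @execution_id = @execution_id OUTPUT;
EXEC [SSISDB].[catalog].[start_execution] @execution_id;
GO
```

This is the scripted equivalent of checking '32-bit runtime' on the Advanced tab of the Execute dialog.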

Tuesday, March 24, 2015

Process SSAS cube in good order with SSIS package

Based on an article by Benny Austin about SSIS Package to Process SSAS Cube

I have decided to process the cubes I have to maintain in the same order as suggested in his article.

  1. Process Dimensions
    1. Process default
    2. Process Update
  2. Process Facts
    1. Process default
      1. Process Partitions – default
      2. Process Measure groups – default
      3. Process Cubes – default
    2. Process Data
      1. Process Partitions – data
      2. Process Measure groups – data
      3. Process Cubes – data
  3. Process Index
    1. Process Dimensions - index
    2. Process Cube measures - index
    3. Process Partitions – index

A screenshot of what the SSIS package looks like in VS 2012:

image

Tuesday, February 10, 2015

SSIS 2012 increment variable in expression task

With SSIS 2012 the Expression Task is a very useful addition.

image

It can replace some of the tedious little script tasks when all you want to do is to change the value of a variable.

However! Although C# in script tasks can look very similar to the expression language used in SSIS, the resemblance can be misleading.

Today I needed a counter in a foreach loop container that had to be incremented by one whenever an error was found in a preceding task. I created a variable of type Int32 that was initially set to zero. Inside the container I added an Expression Task to increment my counter by 1.

There I made my mistake: being used to incrementing with C# I wrote the following (incorrect) statement:

@[User::Errorcounter] += 1

This statement will evaluate to a value of 1 when you click the ‘Evaluate Expression’ button. Since my variable was initially set to zero, this looked OK to me.

image

However, the variable will always stay at value 1, it will never increment!

The correct increment statement is (in a VB-like or SQL-like form)

@[User::Errorcounter] = @[User::Errorcounter] + 1

This also evaluates to 1, but it will increment to 2, 3 and so on when the package is run.

By the way: this is easy to verify by setting the initial value of the variable Errorcounter to another value, e.g. 5. With the incorrect statement the expression will still evaluate to 1; with the correct statement it will evaluate to 6.

Monday, January 26, 2015

Installing Master Data Services Add-in for MS-Excel version 2013

On a PC with an Office 2013 installation, the SQL Server 2012 MDS Add-in for MS-Excel will not install: the installer displays an error message saying that Excel 2007 or 2010 is needed. The solution is simple, just download the SP1 version of the Add-in. You can find it here: Microsoft® SQL Server® 2012 Service Pack 1 (SP1) Master Data Services Add-in For Microsoft® Excel®

The installation file to download for SP1 has exactly the same name as the version for Excel 2007/2010.

image

After the download, run the MSI installation. When the installer has finished, you can see the new ‘Master Data’ item in the Excel menu

image

Hope this helps.