Tuesday, December 30, 2014

Embedding a simple Bing map in a blog post

Simple test to embed a Bing map in a Blogger blog post with Windows Live Writer.

  • open maps.bing.com
  • search a location
  • click on the ‘Share’ link at the upper right hand side of the map
  • in the share dialog, click ‘Customize and preview’
  • In the Customize embedded maps dialog, click on the map to place a pushpin (blue dot)
  • Then click the Generate Code button, copy the code in this dialog window
  • In Windows Live Writer 2012, click on the ‘Source’ tab at the bottom of the window.
  • In the source view, paste your code
  • Publish to your blog.

Friday, December 05, 2014

Test RESTful API with Postman REST Client


The Postman REST client aids in developing client software that needs to send and capture requests from RESTful web service APIs.

It is also a useful tool for the developer of the API: test examples can be set up, stored in text files and handed to the client developer, both as a starting point for the client and as an aid in debugging the client software.

Here you can read the Postman documentation

Install Postman REST client in MS-Windows with Google Chrome

  • Start Google Chrome
  • Go to the extensions page chrome://extensions/
  • Click on Get more extensions
  • In the find text box type: postman rest client
  • The result is shown here
  • In the ‘Postman – REST client (packaged app)’ frame, click on the FREE button.
  • In the ‘Confirm New App’ dialog box, Click on ‘Add’ to install.
  • The browser displays a message that the Postman – REST client has been added to the new Chrome App Launcher
  • You can see the App Launcher in your Windows taskbar. When you click on it, the Launcher opens and the ‘Postman’ app is available
  • Now you can launch Postman from the Launcher by clicking on its icon

Test a RESTful API with Postman

Friday, November 28, 2014

Connecting to a pre-SQL 2012 SSIS Instance with SQL 2012 SSMS

When I tried to connect to a SQL 2008 R2 SSIS instance with SQL 2012 SSMS, I got the following error:

Connecting to the Integration Services service on the computer “MyServerName” failed with the following error: “Class not registered”.

This error can occur when you try to connect to a SQL Server 2005 Integration Services service from the current version of the SQL Server tools. Instead, add folders to the service configuration file to let the local Integration Services service manage packages on the SQL Server 2005 instance.

I found a workaround proposed in this article by Phil Brammer

In short, you have to edit the C:\Program Files\Microsoft SQL Server\110\DTS\Binn\MsDtsSrvr.ini.xml file on your SSIS 2012 instance server, adding folder entries with new folder names that point to the pre-2012 SSIS instance. After restarting the SSIS service on your SSIS 2012 instance, start SSMS 2012 and connect to the SSIS 2012 instance (NOT to the SSIS 2008 one). There you will see the new folders, and when you expand them you will see the packages installed in the SSIS 2008 instance.
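
For illustration, such a folder entry in MsDtsSrvr.ini.xml looks roughly like this (the folder name and server name are placeholders for your own values):

```xml
<Folder xsi:type="SqlServerFolder">
  <Name>SSIS 2008 Packages</Name>
  <ServerName>MyOldServerName</ServerName>
</Folder>
```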


Tuesday, November 18, 2014

Add administrators to reports site after installation of Reporting Services


After installation and setup of the reporting services, when you browse to the reports site, you receive this error:

“User 'Domainname\Username' does not have required permissions. Verify that sufficient permissions have been granted and Windows User Account Control (UAC) restrictions have been addressed.”


You cannot even assign access because the Site settings link on the top right hand side is not available.


2 steps need to be performed:

  • Add a Windows user or group to the System Role Assignments and grant System Administrator rights.
  • Add a role for this user or group at the root folder of the Report Server and grant Content Manager rights.

To perform these steps:

  • Log on locally to the Windows Server running Reporting Services with an account that is a member of the local admin group.
  • From the start button, run IE as administrator
  • In the UAC dialog, click Yes
  • In the IE address bar, browse to http://localhost/reports
  • This time there is no error message and the Site Settings link is available
  • Click on the Site Settings link, then click on the Security tab in the left hand pane
  • Click New Role Assignment in the toolbar above the list of groups and Users.
  • In the New System Role Assignment page, add a Windows Group or username and assign a role, check System Administrator and System User. Click OK.
  • The new role is listed under Group or User.
  • Browse to the Reporting Services Home page, by clicking the Home link at the top of the page.
  • Now you are in the Root folder of the Reports. In the toolbar, click Folder Settings.
  • In the Folder settings, there is only one tab, namely ‘Security’. Click on New Role Assignment.
  • In the New Role Assignment page, type a group or username, check Content Manager. Click OK to create.
  • The new role is assigned and listed in the list of Group or User.

With these 2 actions we granted a Windows group or user administrative rights on the report server. From now on you can browse to the reports site from any client PC, without the need to run as administrator.

To add users that can only browse reports

The following steps are needed:

  • Grant access to the root folder of the reports site.
  • Grant access to the Folder with the reports that the user needs to consult.

Friday, November 07, 2014

On SQL Server 2012 and 2014 Use ALTER ROLE ADD MEMBER instead of sp_addrolemember

While creating some scripts in a test environment to create logins and add role members, I noticed that ALTER ROLE has been upgraded with the ADD MEMBER option. This option replaces the stored procedure sp_addrolemember used in SQL 2008 R2 and previous versions.

The syntax for ALTER ROLE is now:

ALTER ROLE role_name
      [ ADD MEMBER database_principal ]
    | [ DROP MEMBER database_principal ]
    | WITH NAME = new_name

For instance on a SQL Server 2008 R2 granting logins to a Database would be like this:

USE [AdventureWorks]
CREATE USER [Mydomain\MyAccount] FOR LOGIN [Mydomain\MyAccount]
EXEC sp_addrolemember @rolename = 'db_datareader', @membername = 'Mydomain\MyAccount'

With SQL 2012 or 2014 you can now create logins with the following script:

USE [AdventureWorks]
CREATE USER [MyDomain\MyAccount] FOR LOGIN [MyDomain\MyAccount]
ALTER ROLE [db_datareader] ADD MEMBER [MyDomain\MyAccount]

On the MSDN library site, the SQL Server 2012 documentation for sp_addrolemember warns that it will be removed in a future version of SQL Server.

Tuesday, October 28, 2014

Make a dynamic selection from SharePoint list data with the SSIS 2012 OData Source


Use SSIS to extract, transform and load data from a SharePoint list into a SQL Server table. Extraction from the SharePoint list is done in a Data Flow Task by means of the OData Source for SSIS 2012. The data must be extracted dynamically, i.e. only list data from the last <n> days must be read.

The OData Source Editor connects to the SharePoint REST service.

The URL of the REST service: http://myserver/_vti_bin/ListData.svc

A list name appended to the URL of the service returns SharePoint Foundation list data in XML format as a standard Atom feed: http://myserver/_vti_bin/ListData.svc/Incidents

In the OData Source Editor:


To make a selection of data created since 2014-08-04, the Filter is set to : $filter=Created+gt+datetime'2014-08-04'


There is no way to add a parameter in the Query options text box. How can the filter on the Created date be made variable so that only the last <n> days are read for a daily upload?


There are no expressions for the Filter property in the OData Source properties. However, the Data Flow Task that contains the OData Source has an Expressions property that can be used to dynamically set the Query property of the OData Source.

First of all create a Project parameter called ‘LoadDays’ of type Int32 with the value 10.

In the SSIS package add a variable called ‘LoadCreated’ of type DateTime, with an expression to subtract LoadDays days from the current date: DATEADD( "d", - @[$Project::LoadDays] , GETDATE()  )
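
As a sketch of what this SSIS expression computes, here is the same date arithmetic in Python (the parameter name and sample value mirror the LoadDays example above):

```python
from datetime import datetime, timedelta

def load_created(load_days: int, now: datetime) -> datetime:
    # Mirrors DATEADD("d", -LoadDays, GETDATE()): subtract load_days days
    return now - timedelta(days=load_days)

cutoff = load_created(10, datetime(2014, 8, 14))
print(cutoff.date())  # 2014-08-04
```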


In the Control Flow pane, select the Data Flow Task that contains the OData Source.


In the Properties Window, click on the ellipsis button of the Expressions property.


In the Property Expressions Editor window, in the Property column select the OData Source Query property, click on the ellipsis button to open the Expression Builder.


The expression we create will need to mimic an OData filter expression like $filter=Created+gt+datetime'2014-08-04'

In the expression builder, the expression is:

"$filter=Created+gt+datetime'" + (DT_WSTR, 4)YEAR(@[User::LoadCreated]) + "-" + ( MONTH( @[User::LoadCreated]) < 10 ? "0" : "") + (DT_WSTR,2)MONTH( @[User::LoadCreated]) + "-" + (DAY( @[User::LoadCreated])<10 ? "0" : "" ) + (DT_WSTR,2)DAY( @[User::LoadCreated]) +"'"

The OData filter does not accept date literals like ‘2014-8-4’: we need to add a leading zero for the month and day parts, hence the conditional operator ( MONTH( @[User::LoadCreated]) < 10 ? "0" : "")
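
The string that the SSIS expression builds can be sketched in Python, where zero-padded formatting plays the role of the conditional operators (the function name is just illustrative):

```python
from datetime import date

def odata_created_filter(cutoff: date) -> str:
    # Zero-pad month and day so the OData service accepts the literal
    # (e.g. '2014-08-04', not '2014-8-4')
    return "$filter=Created+gt+datetime'{:04d}-{:02d}-{:02d}'".format(
        cutoff.year, cutoff.month, cutoff.day)

print(odata_created_filter(date(2014, 8, 4)))
# $filter=Created+gt+datetime'2014-08-04'
```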

Click OK to close the Expression builder.

Click OK to close the Property Expressions Editor.

With the Data Flow Task selected, click the Data Flow tab. Select the OData Source control and edit it. The filter property now holds the calculated value of the expression:



When the SSIS package runs it will only load list data created after the calculated date.

Tuesday, October 21, 2014

A deadlock was detected while trying to lock variables for R/W access


While testing an Integration Service package with a File Watcher task as described in Using the Script Task in SSIS to Process Data Files When They Arrive, the following error was thrown:

The script threw an exception: A deadlock was detected while trying to lock variables <variablelist> for read access and variables <variablelist> for read/write access. A lock cannot be acquired after 16 attempts. The locks timed out.

This was developed as an SSIS 2012 project and deployed to a SQL Server in the Integration Services Catalog. The package ran fine when I tested it in Visual Studio and when I executed it from the Catalog. The error came when I created a SQL Server Agent job to run the package.

First I read some articles about variable locking in SSIS; but why did my package run without errors when I ran it under my account?


The cause of the problem: the account under which the package runs in the job step does not have read/write access to the folders where the File Watcher Task and the ensuing File System Tasks do their work. In this case the job step was run as the SQL Server Agent service account.

Either give this service account read/write access to the folders used by the SSIS package (which may not be a good idea), or use a dedicated Windows account with sufficient rights and create a SQL Server Agent proxy for this account; see Use a windows account to grant SSIS packages folder and file access on a file share

Thursday, October 16, 2014

Error occurred while the attribute of the dimension from the database was being processed–snowflake dimension


While designing a new snowflake dimension in an Analysis services project, I received the following error message during deployment with Visual Studio of the project to the Analysis Server:

Errors in the OLAP storage engine: An error occurred while the 'Plant' attribute of the 'Device' dimension from the '<projectname>' database was being processed.

The snowflake dimension ‘Device’ is a combination of related tables: Country – Region – Plant – Device. The dimension has the same hierarchy as the one-to-many relationships between these tables.

Each table is composed of a primary key Id column, a name column and various other properties. The snowflake dimension will be used for grouping and totalling production data, so that users can drill down from Country level to Region, to Plant or to Device. The users do not know the Id values; they are shown the names.

The dimension attribute properties are setup to use the Id as Key, but to display the name and order the listed values by name, e.g. for Region

  • KeyColumns: Region.id
  • NameColumn: Region.Name
  • OrderBy: Name

After the initial design of this dimension I wanted to deploy to the server, so as to be able to test my design in the Dimension Browser. Then I received the abovementioned error message.

SOLUTION: look for NULL values in Foreign key column that refers to parent table

The error message was displayed in the Error List window in Visual Studio. The Output window shows some more information about the error.

Warning -2128674815 : Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_Region', Column: 'Id', Value: '0'. The attribute is 'Region'.

When I queried the SQL database table Plant for foreign key RegionId with value 0, no rows were found. When I queried for the foreign key being NULL, I found one such record in the Plant table. This record happens to be the main office of the company and is there for other purposes, so it must not be selected.

  • Go to the Data Source View, right click on Plant table and select ‘Replace Table –> With New Named Query…
  • In the query designer add the condition WHERE RegionId IS NOT NULL
  • Save the query.
  • Deploy the project

The project will now deploy without any further error messages, unless you have other data to clean up.

Wednesday, October 15, 2014

Analysis services 2012: a connection cannot be made, ensure that the server is running

After the installation of a new SQL Server 2012 with Analysis Services, I found that I could reach the Analysis Services database perfectly well when I was remotely logged in on the server and connected with SSMS locally. When I tried to connect with SSMS from another computer, I received this error message:

A connection cannot be made. Ensure that the server is running. (Microsoft.AnalysisServices.AdomdClient)

On the internet some advice is given about this connection problem. On the server with Analysis Services 2012, the SQL Server Browser must be running under the LOCALSYSTEM account. On my server the Browser service was running under LOCALSERVICE. After I changed this I still could not connect remotely.

Another piece of advice was to connect to the Analysis Services instance as <Servername>:2383, in order not to use the SQL Server Browser. But I still received the same error message.

Finally it dawned on me that the firewall was probably blocking this port. On the server with Analysis Services 2012, open ‘Windows Firewall with Advanced Security’ and add a new inbound rule for TCP port 2383 to allow the connection in the Domain profile. This rule is similar to the one that must be set up for the SQL Server service on port 1433.

Open ‘Windows Firewall with Advanced Security’

Select Inbound Rules, In Actions, click on ‘New Rule…’


In the New Inbound Rule Wizard, Rule Type, select Port, click Next


In the Protocols and Ports, Select ‘TCP’, Specific local ports, type ‘2383’, click Next


In Action, Select ‘Allow the connection’, Next


In Profile, check only ‘Domain’ (because I only want to grant access from my corporate domain). Next


In Name, type the name: ‘Sql Server analysis service port’ (or some other name that allows you to recognize your inbound rule). Finish.


As soon as the Inbound rule is created, the connection from another computer to the analysis service will work.

Monday, October 06, 2014

Dates in Excel files rendered from reports are displayed as plain numbers

Problem description

A report subscription is set up to send an email message on a regular schedule. The report is included in the email as an attachment rendered in Excel format.

When MS-Windows users receive the email, they can open the Excel file and the dates are shown in a normal date format:


When iPad users receive the email, they can open the Excel file, but the dates are shown as numbers:




Standard solution for MS-Windows users

In the report design, the date field, e.g. ‘ReportDate’ is displayed as a regular field

The textbox has a custom date format

The report is displayed with this custom format

And the export to excel will show formatted date cells when opened with MS-Windows

However on the iPad the date in column H will be shown as a number.


Workaround for iPad users

In the report design do not use a custom format property; instead use an expression with the Format function to represent the date:
=Format(Fields!ReportDate.Value, "dd/MM/yyyy")


The result from the report subscription email attachment looks like this on the iPad:




With the workaround, MS-Windows users will see the dates visualized correctly, but the cells are treated as plain text, not as dates. The cells are also left-aligned like any standard text.
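
The underlying cause: Excel stores dates as day-count serial numbers and relies on the cell's format to display them, and the iPad viewer apparently ignores that format. A Python sketch of the serial/date relationship (for modern dates, Excel's day 0 is 1899-12-30):

```python
from datetime import date, timedelta

EXCEL_EPOCH = date(1899, 12, 30)  # Excel's day-0 for date serials (1900 system)

def serial_to_date(serial: int) -> date:
    # An unformatted Excel date cell shows this serial as a plain number
    return EXCEL_EPOCH + timedelta(days=serial)

def date_to_serial(d: date) -> int:
    return (d - EXCEL_EPOCH).days

d = date(2014, 10, 6)
print(date_to_serial(d), serial_to_date(date_to_serial(d)))
```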

Thursday, August 28, 2014

Test LightSwitch OData services with Fiddler v 2.4.x

Make a Visual Studio 2013 LightSwitch app based on the Northwind SQL database.

Application name: NorthwindLS

Data Source: NorthwindData

The LightSwitch app is published to an IIS server ‘myhost’.

Normally the media format of the OData response will be ‘atom’.

To show result in JSON format with Fiddler

Start Fiddler v 2.4.x

In the Composer tab, next to the GET command, enter the URL of the OData service published by the LightSwitch app: http://myhost/NorthwindLS/NorthwindData.svc

In the textbox under the GET command, there are 2 lines of text that are sent to the server:

User-Agent: Fiddler
Host: myhost

Add a 3rd line: Accept: application/json;odata=verbose


Click the Execute button to execute the GET command. The response will be in JSON format.
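
The same request can be sketched in Python with urllib; the host and service path are the placeholders from this example, and the request is only built here, not sent:

```python
import urllib.request

# Build (without sending) the same GET request Fiddler composes:
# the Accept header asks the OData service for JSON instead of Atom.
url = "http://myhost/NorthwindLS/NorthwindData.svc"  # placeholder host
req = urllib.request.Request(url, headers={
    "Accept": "application/json;odata=verbose",
})
print(req.get_method(), req.full_url)
print(req.get_header("Accept"))
```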

Friday, August 22, 2014

Environment variable in SSIS not being recognized

Article about Environment variable in SSIS not being recognized by James Serra

If the SSIS package is scheduled to run in a SQL Server job, then the SQL Server Agent service needs to be restarted to recognize the environment variable change.

Thursday, August 14, 2014

Change from Package connections to Project connections while upgrading packages to SSIS 2012

Migration of an SSIS 2008 project to SSIS 2012: all packages have package connections that need to be replaced by project connections to simplify maintenance. The problem is having to go into the design of each and every package and replace every single use in a source, a destination or a SQL task.

Do not change the packages with package connections yet.

Make a backup of each package file!!!

Create all the new project Connections.

Create a new package where you will use all the Project connections, just for reference. A dataflow with source for each connection will suffice. We’ll need this.

Save the package and close it. Right-click on it and select ‘View Code’.

In the XML code you will find the <connections> element:

    refId="Package\Data Flow Task\OLE DB Source.Connections[OleDbConnection]"
    description="The OLE DB runtime connection used to access the database."
    name="OleDbConnection" />

Inside a <connection> element we find the project connectionManagerID and connectionManagerRefId.

For each Package with package connections, use the editor to find each source or destination that uses the package connection

    refId="Package\Data Flow Task\OLE DB Source.Connections[OleDbConnection]"
    description="The OLE DB runtime connection used to access the database."
    name="OleDbConnection" />

Replace the connectionManagerID and connectionManagerRefId with the project connection’s connectionManagerID and connectionManagerRefId.
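
If you have many packages, the replace step can be scripted; a minimal Python sketch, assuming you have already looked up the real package and project refId values (the ids below are placeholders, not real values):

```python
# Placeholders for the two refIds found in the package XML.
PACKAGE_REF = 'Package.ConnectionManagers[OleDbConnection]'
PROJECT_REF = 'Project.ConnectionManagers[OleDbConnection]'

def repoint_connections(dtsx_xml: str) -> str:
    # Plain text replace, exactly what you would do in the XML editor.
    # Make a backup of each package file before doing this!
    return dtsx_xml.replace(PACKAGE_REF, PROJECT_REF)

sample = '<DTS:ConnectionManager DTS:refId="Package.ConnectionManagers[OleDbConnection]" />'
print(repoint_connections(sample))
```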

Wednesday, July 09, 2014

SSIS 2012 new REPLACENULL() function

From MSDN documentation – SQL server 2012


Returns the value of the second expression parameter if the value of the first expression parameter is NULL; otherwise, returns the value of the first expression.

This function was added in SSIS 2012; it is a useful addition that replaces the lengthier expressions needed in SSIS 2008 or 2005.

E.g. after a merge join of 2 data sources A and B, 2 Boolean column values from both tables need to be compared in a conditional split, but sometimes the value on either side can be NULL. In case the value is NULL it is considered to be false.

  • If the values were NOT nullable: IsValueA == IsValueB
    • When either has a NULL value the comparison will throw an error
  • with SSIS 2008: (DT_BOOL) (ISNULL(IsValueA) ? false : IsValueA) == (DT_BOOL) (ISNULL(IsValueB) ? false : IsValueB)
  • with SSIS 2012: REPLACENULL(IsValueA, false) == REPLACENULL(IsValueB, false)

Remark: in the above example I have accepted that if IsValueA is NULL while IsValueB = 0 the values are not technically equal, but in business terms they are.
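
A Python sketch of the comparison logic (replacenull and same_flag are illustrative names, with None standing in for NULL):

```python
def replacenull(value, replacement):
    # Mirrors SSIS 2012's REPLACENULL(expression, substitute)
    return replacement if value is None else value

def same_flag(a, b):
    # Compare two nullable booleans, treating NULL as false
    return replacenull(a, False) == replacenull(b, False)

print(same_flag(None, False))  # True: NULL counts as false in business terms
print(same_flag(True, None))   # False
```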

Wednesday, June 25, 2014

SSAS The connected user is not an Analysis Services server administrator.


On a new development system, my Windows account has local admin rights and sysadmin rights in the SQL Server 2012 installation with SSAS.

With SQL Server Data Tools 2010 I want to create and deploy an Analysis Services data mining project. When the data source, views and mining structures are set up, I want to deploy to the server for the first time, but I get this error message:

The connected user is not an Analysis Services server administrator.


The reason for the problem is the UAC on the development OS. The solution is to add my Windows account explicitly as an Analysis Services server administrator. But when I use SSMS to connect to Analysis Services and try to add my account, I get a similar error message.

The solution is:

  • run SSMS as an administrator
  • connect to the Analysis Services Server
  • right-click on the server name and select ‘Properties..’
  • In the Security page of the Server Properties window, click on the ‘Add…’ button, select your own Windows account and add it to the Server administrators.

By explicitly adding your own account to the Analysis Services server administrators, you can now create and deploy projects to the Analysis Services server without running SSMS or SQL Server Data Tools as administrator.

SSAS error Unable to connect to the localhost server. Make sure that the server is started.


An error message on a new development system with SQL Server 2012, the first time I created an SSAS data mining project with BI Studio 2010.

When the data source connection and a view were made, I switched over to the Mining Model Viewer and got this error message:

‘Unable to connect to the localhost server. Make sure that the server is started.’


This occurs because the SQL Server was set up with a named instance. When you create a new Analysis Services project, the project defaults to ‘localhost’ as the deployment server.


Localhost is fine only when the SSAS instance is the default instance.

To fix the problem, we need to set the Deployment Server in the project to the Named instance of the local SQL server.

In the Solution Explorer, right-click on the Project and select Properties…


In the Project property Pages tree view, under Configuration Properties, Select Deployment


In the pane on the right-hand side, in the Server property replace ‘localhost’ by the named instance, e.g. ‘MyServer\MyInstance’ and click OK to close the dialog.

In the Mining Model Viewer, click on the little refresh button next to the empty viewer dropdown box.


Then the name of the viewer selected during the views setup appears in the dropdown, together with a dialog box to build and deploy the project.


From then on you can continue your Analysis Services project.

Wednesday, June 04, 2014

A quick way to convert integer to nchar right aligned with leading zeroes

For an existing database table I urgently needed to translate integer values into nchar values with leading zeroes.

e.g. for an nchar destination column of length 2:

  • convert value 1 to ‘01’
  • convert value 2 to ‘02’
  • convert value 11 to ‘11’

The simplest solution that came to my mind at that moment was to add an integer value that is a multiple of base 10, with enough trailing zeroes to act as leading zeroes in the nchar field. In case of an nchar(4) column I need to add 10000; in case of nchar(2) I need to add 100.

  • convert value (100 + 1) to ‘101’
  • convert value (100 + 2) to ‘102’
  • convert value (100 + 11) to ‘111’

The result has one character too many, so I need to strip this first character with the SUBSTRING function:

  • SUBSTRING(‘101’, 2, 2) to ‘01’
  • SUBSTRING(‘102’, 2, 2) to ‘02’
  • SUBSTRING(‘111’, 2, 2) to ‘11’
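
The same trick, sketched in Python (the function name is illustrative): adding 10^width and then dropping the first character is exactly the SUBSTRING step:

```python
def int_to_nchar(value: int, width: int) -> str:
    # Add 10**width so the 'trailing zeroes' become leading zeroes,
    # then drop the first character: SUBSTRING(..., 2, width)
    return str(10 ** width + value)[1:1 + width]

print(int_to_nchar(1, 2), int_to_nchar(2, 2), int_to_nchar(11, 2))
# 01 02 11
```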

T-SQL example: I have a dimension table Dim_YearMonth that needs to be populated with data from a Dim_Year and a Dim_Month table. The table Dim_YearMonth has one nchar(7) column called [YearMonthName] that is used as the display value in dropdown lists. It has to be populated with data like ‘2014-01’, ‘2014-02’.

I have a table Dim_Year with one smallint column YearId ranging from 2005 to 2025.

There is a second table Dim_Month with smallint column MonthId ranging from 1 to 12.

Part of the T-SQL statement to combine year and month into ‘YYYY-MM’ is:

, CAST(y.YearID as nchar(4))+ '-' + SUBSTRING( CAST( (100+m.MonthId) as NCHAR(3)),2,2) as YearMonthName

The T-SQL statement to select Year and month combined and insert into the table Dim_YearMonth:

select (y.YearID *100 + m.MonthId) as YearMonthId
    , CAST(y.YearID as nchar(4))+ '-' + SUBSTRING( CAST( (100+m.MonthId) as NCHAR(3)),2,2) as YearMonthName
    , YearID, MonthId
    from dbo.Dim_Year as y, dbo.Dim_Month as m

The result looks like this:

YearMonthId    YearMonthName    YearID         MonthId
201308         2013-08          2013           8
201309         2013-09          2013           9
201310         2013-10          2013           10
201311         2013-11          2013           11
201312         2013-12          2013           12
201401         2014-01          2014           1
201402         2014-02          2014           2
201403         2014-03          2014           3
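
The cross join that fills Dim_YearMonth can be sketched in Python (a small sample year range is assumed):

```python
from itertools import product

# Sketch of the Dim_Year x Dim_Month cross join
years = range(2013, 2015)   # sample Dim_Year values
months = range(1, 13)       # Dim_Month: 1..12

rows = [(y * 100 + m, "{:04d}-{:02d}".format(y, m))
        for y, m in product(years, months)]
print(rows[7])  # (201308, '2013-08')
```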

Thursday, May 22, 2014

Compare 2 SQL tables to find missing rows and fix the difference.


I have 2 database tables in 2 different databases and they should contain the same data. In this particular case the data is inserted by an application, row by row. The primary key is of type uniqueidentifier with default value newid(), so it will never be the same in both tables. The tables should have the same number of rows and all columns should contain identical data.


The T-SQL clause EXCEPT can help to verify if the tables contain the same rows.

From the SQL BOL: EXCEPT returns any distinct values from the left query that are not also found in the right query.


Compare all columns:

SELECT * FROM db1.dbo.Product
EXCEPT
SELECT * FROM db2.dbo.Product

This will return all rows in the table of db1 that are not found, or are different, in the table of db2.

But with the primary key column always being different (because the guid is generated per table and per row), all rows will be returned, even when the number of rows is equal and all the other columns are equal.

To compare all columns except the primary key column, we select the columns by name, leaving out the primary key:

SELECT [name], [price], [categoryid], [supplierid] FROM db1.dbo.Product
EXCEPT
SELECT [name], [price], [categoryid], [supplierid] FROM db2.dbo.Product

In this case all the rows that are in db1 but not in db2 will be shown.

To add the missing rows to the db2 table (the primary key has default value newid()):

INSERT INTO db2.dbo.Product ([name], [price], [categoryid], [supplierid])
SELECT [name], [price], [categoryid], [supplierid] FROM db1.dbo.Product
EXCEPT
SELECT [name], [price], [categoryid], [supplierid] FROM db2.dbo.Product
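
The EXCEPT-and-INSERT pattern can be sketched in Python with set difference over row tuples (the product rows below are made-up sample data):

```python
# EXCEPT compares distinct rows; sets of tuples give the same semantics.
db1_product = {("Chai", 18.0, 1, 1), ("Ikura", 31.0, 8, 4)}
db2_product = {("Chai", 18.0, 1, 1)}

missing_in_db2 = db1_product - db2_product  # rows to INSERT into db2
print(missing_in_db2)
```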

Tuesday, May 13, 2014

SQL 2008 Create a unique index on field with NULL values

According to ANSI standards a UNIQUE constraint must disallow duplicate non-NULL values but accept multiple NULL values. In SQL Server a single NULL is allowed but multiple NULL values are not.

In SQL Server 2008, it is possible to define a unique filtered index based on a predicate that excludes NULLs (the index name below is illustrative):

CREATE UNIQUE NONCLUSTERED INDEX UX_MyTable_Mycolumn
ON MyTable(Mycolumn)
WHERE Mycolumn IS NOT NULL
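
What such a filtered index enforces can be sketched in Python (a hypothetical helper, with None standing in for NULL):

```python
def unique_ignoring_nulls(values) -> bool:
    # ANSI-style uniqueness: duplicate non-NULL values are rejected,
    # while any number of NULLs (None) is allowed.
    non_null = [v for v in values if v is not None]
    return len(non_null) == len(set(non_null))

print(unique_ignoring_nulls([1, 2, None, None]))  # True
print(unique_ignoring_nulls([1, 1, None]))        # False
```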

Thursday, May 08, 2014

Odata service with EF 6 and WCF services 5.6 fails with Request Error


I tried to follow a few more examples about building OData services with Visual Studio 2013, Entity Framework 6 and WCF Data Services 5.6, but when I tried to browse the WCF (OData) service from VS 2013, it started IE with an error message:

“Request Error. The server encountered an error processing the request. See server logs for more details.”


I tried various combinations, but to no avail (I even used another development system, in case my own development environment was botched up).


Eventually I found an answer on stackoverflow.com, in a post by Kourosh:

“it seems the entity framework 6 and wcf data services 5.6 , need some provider to work together. you can download the provider simply by using nuget package console manager :

Install-Package Microsoft.OData.EntityFrameworkProvider -Pre

it’s version is alpha 2, so in future, search for final release. it worked for me however.

final thing is instead of DataService, you need to use EntityFrameworkDataService. T is the name of your entities.”

Thank you Kourosh. Now my OData service with EF 6 and WCF data services 5.6 works!


Step 1: install the package

Step 2: edit the WCF web service C# code and replace DataService with EntityFrameworkDataService.

using System.Data.Services.Providers;

namespace DataAccessMydata
{
    public class ODataMydata : EntityFrameworkDataService<MydataContext>
    {
        // ...
    }
}

Wednesday, May 07, 2014

Adding query options to the SSIS OData Source


In my previous article about the SSIS OData Source I created an SSIS 2014 package with an OData source connected to the OData.org sample web service.

OData query options

The OData Source has query options that can be used to alter the collection that is returned. I have made some tests with these options.

Option Description
$expand Expands related entities inline
$filter Filters the results, based on a Boolean condition
$inlinecount Tells the server to include the total count of matching entities in the response
$orderby Sorts the results
$select Selects which properties to include in the response
$skip Skips the first n results
$top Returns only the first n results

See also OData URI conventions, paragraph 4
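
Multiple options are combined into a single string with ‘&’; a small Python sketch using the $top/$orderby values from the examples in this post:

```python
# Combine OData query options into one 'Query options' string.
options = {"$top": "5", "$orderby": "Name desc"}
query = "&".join("{}={}".format(k, v) for k, v in options.items())
print(query)  # $top=5&$orderby=Name desc
```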

Set query options in OData Source editor

In the Data Flow, select and edit the OData Source. The ‘Query options:’ textbox can be used to set options.



e.g. to order by the product name: $orderby=Name. To verify the result, click on the ‘Preview…’ button.



e.g. select all products with Rating 3: $filter= Rating eq 3


e.g. select products released in the year 2006: $filter= year(ReleaseDate) eq 2006


e.g. select discontinued products: $filter= DiscontinuedDate ne null



e.g. select top 5 products ordered by Name descending: $top=5&$orderby=Name desc



e.g. select only 2 columns on the products table: $select=Price, Name


$inlinecount, $format

Not useful for SSIS purposes.


I will investigate this further once I have better samples than the odata.org samples.