Jeff Prom's SQL Server Blog

Sharing knowledge and tips on SQL Server, Business Intelligence and Data Management.

Sorting a Chart in SSRS

Posted by jeffprom on March 20, 2015

Sorting a chart in SSRS sounds easy enough, but the sort option isn’t where you might expect to find it. In this example I am using Visual Studio 2012.

First let’s take a look at an unsorted chart. Even if you sort values in your underlying dataset, your chart will likely not be sorted.

ssrs chart unsorted

You might expect to find sort options by right-clicking the chart and opening Chart Properties, but they are not there. Series Properties seems like the next logical place, yet that will not work either. Here is where the sort actually lives: double-click the chart to bring up the Chart Data box. Value options might look promising, but once again the sort is elsewhere. Instead, click the drop-down arrow under Category Groups and select Category Group Properties.

ssrs chart select category group

Here you will find a Sorting option. Click Add and select what you would like to sort on.

ssrs chart add sort order

Now you can see that we finally have a sorted chart.

ssrs chart final sorted result

Posted in SSRS

SSRS Chart Does Not Show All Labels

Posted by jeffprom on March 2, 2015

When creating a new chart in SSRS, you may find that not all of the labels are displayed on the vertical or horizontal axis. This happens because, with the default setting, the chart decides for itself how many labels it can show, and sometimes that isn't the best result. For example, in my chart below there is plenty of room to show all of the labels, but it decides not to.

ssrs chart final sorted result

In this case we definitely want to show all of the labels, so we need to adjust a setting. Double-click the label names on the chart.

ssrs select label names

On the right-hand side of Visual Studio, you should now see the Chart Axis values in the Properties pane.

ssrs axis properties

Under the Interval property, change the value from 'Auto' to 1. This specifies that we want to show a label for every occurrence.

ssrs set interval value

Re-run the report and now you can see that all of the labels are shown.

ssrs chart with all labels

Posted in SSRS

Importing Stock Quotes Into SQL Server Through SSIS

Posted by jeffprom on January 23, 2015

I've worked at a publicly traded company for the past year and a half, and one thing I always thought would be fun was to download stock values through SSIS and use them in reports or a dashboard. Sounds easy enough, but it turned out to be a bit harder than I expected, for a few reasons:

1. Finding a stock quote provider that worked well with SSIS was a challenge.
2. I wasn’t finding any other good tutorials on how to download or process stock quotes using SSIS.
3. I lacked the time to research and investigate a solution.

However, all three of those hurdles have finally been overcome, which is why I am posting this article in case others are hoping to do the same. So let's take a look at how it can be done. In my examples I am using Visual Studio 2012 and SQL Server 2012. If you are loading data for just one stock, there won't be a massive amount of data, so it's easiest to truncate the destination table and reload it each day, which is what I'll show here.

First, decide which stock symbol you would like to use. In my examples we will use Microsoft (MSFT). Their stock dates back to March 13th, 1986 and gives us some good data to work with. To get historical stock quotes, we are going to use an API from Yahoo Finance: a URL that accepts a stock symbol parameter and returns a csv file.

Here is what it looks like: http://ichart.finance.yahoo.com/table.csv?s=MSFT
The csv file includes data starting with the previous trading day (yesterday) and goes all the way back to the IPO.

Now that we have a way to get stock quotes, let's create an SSIS package and work on processing the data. Retrieving data over an HTTP connection is the next challenge to overcome. SSIS has an option to create an HTTP connection under Connection Managers, but unfortunately it will not accomplish what we need. Let's take a different approach. Start by creating two variables:
1. LocalFilename (point to a local folder and filename)
2. RemoteFilename (http://ichart.finance.yahoo.com/table.csv?s=MSFT)

variables used

Next, add a Script Task to the Control Flow. Edit the Script Task and add the two variables we just created to the ReadOnlyVariables.

add script task

Click on ‘Edit Script’. Under the public void Main() section, add the following:

// Container that will hold the locked SSIS variables
Variables varCollection = null;

// Lock the two package variables for reading and fetch them into the collection
Dts.VariableDispenser.LockForRead("User::RemoteFilename");
Dts.VariableDispenser.LockForRead("User::LocalFilename");
Dts.VariableDispenser.GetVariables(ref varCollection);

// Download the csv file from the remote URL to the local path
System.Net.WebClient myWebClient = new System.Net.WebClient();
string webResource = varCollection["User::RemoteFilename"].Value.ToString();
string fileName = varCollection["User::LocalFilename"].Value.ToString();
myWebClient.DownloadFile(webResource, fileName);

// Release the variable locks and report success
varCollection.Unlock();
Dts.TaskResult = (int)ScriptResults.Success;

script task code

At this point you can run the package and it should download the csv file to your local folder.

You can open the csv file and see that there are 7 columns (Date, Open, High, Low, Close, Volume, Adj Close).
(CAUTION: If you leave the csv file open and try to run the package again, you will get an obscure error because the script task cannot get a lock on the file)

error message

Now is also a good time to create a table where we can store the incoming data. Here is an example of the table I created. Feel free to modify it to fit your needs.

CREATE TABLE [dbo].[FactStockValue](
    [StockValueKey] [int] IDENTITY(1,1) NOT NULL,
    [DateKey] [int] NULL,
    [StockSymbol] [varchar](5) NULL,
    [OpenAmount] [money] NULL,
    [HighAmount] [money] NULL,
    [LowAmount] [money] NULL,
    [CloseAmount] [money] NULL,
    [Volume] [int] NULL,
    [AdjustedCloseAmount] [money] NULL,
    CONSTRAINT [PK_FactStockValue] PRIMARY KEY CLUSTERED ([StockValueKey] ASC)
)
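Since the plan is to truncate and reload each day, one way to handle that is an Execute SQL Task at the start of the Control Flow (before the Data Flow Task we add below). A minimal statement for it:

-- Clear out the previous load before importing a fresh csv
TRUNCATE TABLE dbo.FactStockValue;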

Instead of storing just the date value, I convert the date to a smart key value which can then join to a date dimension table. Feel free to skip this step and store just the date if you like.
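If your date dimension uses a yyyymmdd-style integer key, the conversion itself is simple. Here is the idea in T-SQL, just to illustrate the smart key format (in the package the DateKey actually comes from the date dimension lookup described below):

-- Convert a date to a yyyymmdd integer smart key (style 112 = ISO yyyymmdd)
DECLARE @TradeDate date = '1986-03-13';
SELECT CONVERT(int, CONVERT(char(8), @TradeDate, 112)) AS DateKey; -- returns 19860313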

Now let’s hop back into the SSIS package and add a Data Flow Task. Within the Data Flow Task, add a Flat File Source and create a new connection that points to the local csv file. Since the csv file doesn’t include the stock symbol, I added a Derived Column transform to specify the symbol we are processing.

derived column

Next, I added a Data Conversion transform to convert Date to a database date [DT_DBDATE], which is needed for the lookup against my date dimension table, and to convert the stock symbol value from Unicode to string [DT_STR].

data conversion

In my example, I am doing a date lookup on a date dimension table to get a DateKey value. Finally, end with an OLE DB Destination which points to the destination table. Run the package and verify that it works.
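For the Lookup transform, the reference query only needs the surrogate key and the full date. Something like the following works, assuming an AdventureWorks-style dimension named dbo.DimDate with a FullDateAlternateKey column (adjust the names to match your own dimension):

-- Lookup reference query: match the converted csv Date to the dimension's
-- full date column and return the surrogate DateKey.
SELECT DateKey, FullDateAlternateKey
FROM dbo.DimDate;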

data flow success

Check the destination table and verify that everything looks correct.
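A couple of quick queries against the destination table make the check easy:

-- Most recent trading days should appear at the top
SELECT TOP (10) *
FROM dbo.FactStockValue
ORDER BY DateKey DESC;

-- The row count should roughly match the number of trading days since the IPO
SELECT COUNT(*) AS RowsLoaded
FROM dbo.FactStockValue;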

table results

Now that we have a way to retrieve and store stock values, we can use them on dashboards and reports!

stock value chart

Posted in SSIS

Cleansing Data Through SSIS with DQS

Posted by jeffprom on January 13, 2015

While Data Quality Services (DQS) takes some time to build up an effective knowledge base through manual data review, there are times when it would be handy to run cleansing from SSIS. This is only recommended when you need quick results and feel good about your knowledge base, or when you are using an external knowledge base through reference data. Should you decide to cleanse data through SSIS using DQS, note that the transform can output quality-control metrics such as the Status and Confidence score alongside the final data, which can later be used to review the results and make final decisions. So let's take a look at using the DQS Cleansing transform in SSIS.

In this example I created a basic SSIS package with one Data Flow Task. I selected an Excel Source and pointed it to a file that has some data to be cleansed: a handful of bad addresses that I will attempt to cleanse using Melissa Data. To learn more about the Melissa Data Address Check add-in and reference data in general, view my other blog post here.

Create a new package, add a Data Flow Task, and then add a DQS Cleansing transform object.

dqs cleansing transform setup

Click on the New button next to the Data quality connection manager to set up a new connection to an instance of DQS, and then select a knowledge base.

dqs cleansing transform setup with connection

Click on the mappings tab. Here you will map input columns from your data source to the domains on the knowledge base.

dqs cleansing transform with mappings

If you would like to see additional columns in the final results, such as the Confidence score and the Reason, click on the Advanced tab.

dqs cleansing transform advanced tab

When finished, click OK and test it out. In my example I am just sending the rows to a Union All object as my destination for demo purposes, but I can still see the resulting cleansed data through a Data Viewer.

dqs cleansing results from transform

As you can see from the results, two addresses were corrected, two received suggestions, and one failed to be cleansed. We could get fancy and split up the results based on the Status values: Corrected records may be considered ready to ship, Auto Suggest records should probably be reviewed, and New records definitely need review since they were not found in the knowledge base.
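If you land the transform output in a staging table instead of a Union All, splitting the records out by status is simple T-SQL. The table name below is made up for illustration, and Record Status is the status column the DQS Cleansing transform adds to the output:

-- How many records fell into each cleansing status?
SELECT [Record Status], COUNT(*) AS RecordCount
FROM dbo.StagingCleansedAddress   -- illustrative staging table name
GROUP BY [Record Status];

-- Records that still need a human decision
SELECT *
FROM dbo.StagingCleansedAddress
WHERE [Record Status] IN ('Auto Suggest', 'New');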

Using the DQS Cleansing transform in SSIS is useful but doesn’t eliminate the need to review the final cleansed results. Cleansing data through SSIS is dependent on the quality of the underlying knowledge base but does provide a quick and efficient way to do an initial data cleanse. The resulting cleansed data can then be evaluated using the Status, Reason and Confidence score values. Once the data has been processed and reviewed, only then should it be considered cleansed and ‘ready for use’.

Posted in DQS, SSIS

Cleansing Address Data With Data Quality Services (DQS) and Melissa Data

Posted by jeffprom on January 9, 2015

In this post I will show how to cleanse address data using Microsoft Data Quality Services (DQS) and Melissa Data as a reference data service. First, you need to make sure you have a working installation of DQS. If you don’t, you can view my other post here that walks you through installing DQS. Once DQS has been installed, you will need to visit the Microsoft Azure Marketplace website and subscribe to the Melissa Data Address Check service. When I wrote this post, you could get 1,000 addresses cleansed per month for free. This is useful for testing or if you have a small set of data to be cleansed. Select the desired service level/amount from the list and click the Sign-Up button. You will now have a subscription added to your account.

melissa data service on the azure marketplace

Next, we need to tell DQS about our Azure Marketplace account. On the Microsoft Azure Marketplace website, click on ‘My Account’ and look for the Primary Account Key value.

azure marketplace ID

Highlight and copy the Key value. Open the Data Quality Client tool and click on Configuration on the right-hand side. Paste your Key value into the DataMarket Account ID entry box and click on the validate button.

dqs validate account ID

Once your ID has been validated, you are ready to set up a knowledge base in DQS that will use Melissa Data. In the DQS client, click the Close button on the Configuration screen to get back to the main menu. At the top left, click on New Knowledge Base. Add four new domains (Address, City, State, Zip). Next, add one composite domain (FullAddress) and include the four domains you just created.

setting up the knowledge base domains

With the FullAddress composite domain selected, click on the Reference Data tab. Click on Browse and check the Melissa Data service.

On the Schema mapping table, create the following mappings between the RDS Schema and Domains:
1. AddressLine (M) = Address
2. City = City
3. State = State
4. Zip = Zip

adding melissa data as reference data

You can adjust several of the provider settings, such as Auto Correction Threshold, Suggested Candidates, and Min Confidence. Once finished, click the Finish button and select Publish.

save knowledge base with reference data

Next, we will cleanse a set of data using our new knowledge base. On the main screen of the DQS client, click on New Data Quality Project. Give it a name, and be sure to select the knowledge base we just finished setting up that uses the Melissa Data reference data.

new data quality project

Click Next and select a data source that points to address records you would like to cleanse. Map the 4 Source Columns to the appropriate Domains. If you have mapped all 4 correctly, you should be able to click on the ‘View/Select Composite Domains’ button which tells you that the composite domain will now be used.

data quality project - map domains

Click Next, and Start. DQS will analyze the data and provide a summary of results.

cleansing profile results

Click Next to see the cleansing suggestions and to work through the corrections. In my example I am using five addresses of Melissa Data office locations that I found on their website. I altered some aspects of the addresses to see how well DQS could cleanse the data. Let's take a look at the results.

record results before adjustments

As you can see here, DQS suggested two corrections (Road to Rd, Drive to Dr). The interesting thing is that neither correction matches what is on their website (they have Road and Dr.). Perhaps they need to cleanse their own addresses. :) It also listed one entry as Invalid even though all I did was add a 1 to the address value. That record would need to be manually reviewed and corrected, or we could adjust the threshold settings to see if we get better results. DQS successfully corrected two records with a good confidence score; it wasn't fooled by me changing the City name on one and the Zip on the other.

approved results

Work through the results by making adjustments, approving or rejecting records, and finally exporting them to their destination (SQL Server, CSV, or Excel).

Address cleansing can be an important yet difficult task. However, with DQS we now have the ability to tap into external knowledge bases such as Melissa Data to help. By leveraging reference data within DQS, we can quickly and effectively cleanse address information.

Posted in DQS

SQL Saturday #350 – Winnipeg

Posted by jeffprom on October 28, 2014

sqlsat350_web

On Saturday, November 22nd I will be at the Winnipeg SQL Saturday #350 event and will be giving a presentation on SQL Injections. The event will be held at the Red River Community College – downtown, 160 Princess St., Winnipeg, MB R3B 1K9, Canada. This is a great opportunity to learn more about SQL Server and I highly encourage everyone to attend.

Here is the topic I will be presenting on:

SQL Injections and How To Stop Them

SQL injection is a technique where malicious users inject SQL commands into a SQL statement through web page input. If left unprotected, SQL injections can wreak havoc on web applications by compromising security, manipulating data, hoarding system resources, retrieving sensitive data, removing databases and database objects, or taking databases offline! This session will show how SQL injections work, how to identify whether you are being attacked, and which prevention techniques are effective.
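As a small preview of the topic, here is the classic difference between concatenating user input into a statement and parameterizing it. The table and input values are made up purely for illustration:

-- Vulnerable pattern: user input concatenated straight into the statement.
-- An input of  ' OR 1=1 --  turns the WHERE clause into "always true".
DECLARE @UserInput nvarchar(100) = N''' OR 1=1 --';
DECLARE @UnsafeSql nvarchar(max) =
    N'SELECT * FROM dbo.Customer WHERE LastName = ''' + @UserInput + N'''';
-- EXEC (@UnsafeSql); -- injectable, do not do this (dbo.Customer is a hypothetical table)

-- Safer pattern: the value is passed as a parameter and treated as data, not code.
EXEC sys.sp_executesql
    N'SELECT * FROM dbo.Customer WHERE LastName = @LastName',
    N'@LastName nvarchar(100)',
    @LastName = @UserInput;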

To register for this event and to find additional information, you can visit the SQL Saturday Winnipeg page here:

https://www.sqlsaturday.com/350/eventhome.aspx

 

Posted in Events, Security

PASS Summit 2014 Here I Come!

Posted by jeffprom on October 16, 2014

PASS_14_Attending_250x250

I am fortunate again this year to have the opportunity to attend the PASS Summit conference. Big huge thanks and shout out to my current employer (Silver Bay Realty Trust Corp) for sending me. This will be my 3rd time attending and coincidentally all three of my visits have been to Seattle, WA which is great because Seattle is a very fun city to visit. If it’s anything like my previous visits, it should be a great conference filled with a lot of great learning and networking opportunities. Hope to see you there!

 

Posted in Events

SQL Saturday #332 – Minnesota 2014

Posted by jeffprom on September 5, 2014

sqlsat332_web-white

This year I have been selected to present at SQL Saturday #332 Minnesota on what I consider to be a fascinating topic: SQL injections. If left unprotected, your web applications can be severely compromised by SQL injection attacks. To learn more about SQL injections, I invite you to attend my session on October 25th, 2014.

SQL Injections and How To Stop Them

SQL injection is a technique where malicious users inject SQL commands into a SQL statement through web page input. If left unprotected, SQL injections can wreak havoc on web applications by compromising security, manipulating database data, hoarding system resources, retrieving sensitive data, and even removing database objects or the database itself! This session will show how SQL injections work, how to identify whether you are being attacked, and which prevention techniques are effective.

There is a fantastic list of presenters at this event! Secure your spot today by signing up on the SQL Saturday website.

http://sqlsaturday.com/332/eventhome.aspx

 

Posted in Events, Security

SQLSaturday #291 – Chicago 2014

Posted by jeffprom on March 19, 2014

I am happy to announce that I was selected to speak at SQLSaturday Chicago! This event will be held April 26, 2014 at 1221 N Swift Road, Addison, IL 60101. I will be presenting on Master Data Services.

Master Data Services
Does your company have similar data in multiple locations? Not really sure which set of data is accurate? With MDS, Microsoft gives you the ability to have one accurate source of non-transactional data to use with your applications or send to other databases and ETL processes. Leverage data stewards to help manage the data repository to ensure it is up-to-date and accurate. In this session we will cover MDS from start to finish including installation, creating models/entities, working with data, using the Excel add-in, security, hierarchies, views, versions, and business rules. After this session, you should have a good understanding of what MDS does, how to use it, and how to get started with your own MDS project.

Additional information can be found here:
http://sqlsaturday.com/291/eventhome.aspx

Be sure to register for this event today!

Posted in Events, MDS

Generating Dynamic CRUD Queries Within Microsoft Excel

Posted by jeffprom on February 21, 2014

Last year I learned a neat trick from Ross McNeely. We kept getting ad-hoc Excel documents sent to us, and we needed to update the databases with those values. Traditionally you might go through the Import/Export Data wizard, or create an SSIS package to import the data. That usually leads to data conversion issues, possible temp tables, writing extra queries, and sometimes all-around headaches.

This method shows how to write CRUD SQL statements within Excel that can then be run against the database. It is a really quick way to take a handful of fields and update data with little effort. If you have a lot of fields, you may want to consider the more traditional route, because it can take some time to write long query statements in Excel. That said, I have used this technique probably 100 times, so I decided to post about it. Let's take a look at how it works.

I have two examples here that will update two tables in the Adventure Works DW database. Let’s pretend that someone has just sent us an Excel file and needs product information updated in the DimProduct table. This is what the data looks like.

First, find an open cell off to the right (column F) and begin writing the SQL query as shown here. Start the cell with =" and place single quotes where the SQL statement will require them. To reference other data from the row, close the text with a double quote, add an ampersand, and then the cell location. To continue the rest of the statement, simply add another ampersand and a double quote.

Continue until the rest of the statement is complete. Here is what it should look like:
="update DimProduct set EnglishProductName='"&B2&"', StandardCost='"&C2&"', FinishedGoodsFlag="&D2&", Color='"&E2&"' where ProductAlternateKey='"&A2&"'"

Once finished, click off of the query statement. If it worked correctly you should now see the query populated with values. If you received some errors, go back and double check the statement. Syntax issues are the most common mistakes.

Now that we have one statement that looks correct, let's quickly fill in the other rows. Single-click the cell with the query (F2), and then double-click the green fill handle in the lower right corner. This will populate the rest of the rows with the same query template.

After the rest of the rows are populated, you should now have a complete set of queries.

One last trick to note involves dates: you will need to use the TEXT() function, as shown here, to specify a date format.
="update FactInternetSales set ShipDate='"&TEXT(C2,"yyyy-mm-dd")&"' where SalesOrderNumber='"&A2&"' and SalesOrderLineNumber="&B2

The final step is to copy the query statements from Excel and paste them into SQL Server Management Studio where you can simply execute them to update the database. In the previous screenshot you would click on column D, copy, and then paste into Management Studio, and execute.
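What lands in Management Studio is just a batch of ordinary UPDATE statements; the values below are made up, but the shape is what the formulas above generate. Wrapping the batch in a transaction makes it easy to sanity-check the affected row counts before committing:

BEGIN TRANSACTION;

update DimProduct set EnglishProductName='Mountain-100 Silver, 38', StandardCost='1912.15', FinishedGoodsFlag=1, Color='Silver' where ProductAlternateKey='BK-M82S-38';
update FactInternetSales set ShipDate='2013-01-29' where SalesOrderNumber='SO43697' and SalesOrderLineNumber=1;

-- COMMIT TRANSACTION;   -- commit once the affected row counts look right
-- ROLLBACK TRANSACTION; -- or roll back if anything looks off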

And there you have it: writing queries directly within Excel provides a quick way to take data from a spreadsheet and manage data in your database with very little effort.

Posted in Excel, T-SQL

 