
Archive for August, 2011

A very common question we get at timeXtender is why SSIS has to be installed on the same machine as timeXtender. The short and straightforward answer is "by design, as per Microsoft". If you would like a few more details about this limitation, please read on.

SSIS vs. DTS

Data Transformation Services (DTS) is the predecessor to SSIS. The DTS component was a redistributable client component, and did not require a separate server license to be used from any client machine.

SSIS is not redistributable, and this has been the case since it first shipped with SQL Server 2005. You need a full server license on any machine on which you want to run an SSIS package. If you follow the reference link below, you will see the following:

“Although installing BI Development Studio lets you design packages, running packages outside the development environment requires the installation of Integration Services. Subject to the terms of your SQL Server licensing agreement, you may be able to run Setup again to add Integration Services to your installation. Integration Services is now a server component and is not redistributable to client computers in the same manner that the components required for SQL Server 2000 Data Transformation Services (DTS) were redistributable.”

Remote Execution (Lack of)

There are a number of ways to execute an SSIS package (DTEXEC.EXE, DTEXECUI.EXE and SQL Server Agent), but SSIS does not have a remote execution feature that would allow a package to be executed from a remote machine, such as an application server running tools like timeXtender. We can quite simply conclude that the machine that executes an SSIS package needs to have SSIS installed.
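For example, executing a package from the DTEXEC command line looks like this (the package names, paths and server name below are purely illustrative):

```shell
# Run a package stored on the file system
dtexec /F "C:\Packages\LoadSales.dtsx"

# Run a package stored in msdb on a SQL Server instance.
# Note that dtexec itself still runs locally, so SSIS must be
# installed on the machine where you issue this command.
dtexec /SQL "LoadSales" /SERVER "MYSERVER\SQL2008"
```

In both cases the package executes on the machine running dtexec, never on a remote machine, which is exactly the limitation described above.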

SSIS and timeXtender

SSIS does not have to be installed on every machine running timeXtender, but it does need to be installed on every machine that deploys and executes SSIS packages. Specifically, SSIS needs to be installed on the following:

  • The machine running the timeXtender scheduler
  • The machine running the timeXtender server (multiple environments)
  • The machine that reads the source schema (unless using ADO.NET)

References

One of the few places where Microsoft has documented the license model is in the "The Product Level Is Insufficient" topic. This is the error message you will receive if you attempt to run an SSIS package without a license.


In this post I will show you how to use prototyping to demonstrate business requirements and get rapid, actionable feedback. How you arrived at the business requirements is not important for the purpose of this post, but if you have read some of my earlier blog posts, it should be clear that I recommend taking an agile approach.

The goals for the prototype

  • Demonstrate business requirements
  • Deliver fast, using minimum resources
  • Minimize rework

I am going to build the prototype using the dimensional model approach, as per best practice. Although not strictly necessary for a prototype, I am even using surrogate keys, as I will most likely reuse the prototype for the permanent implementation after one or more very short iterations.

The four easy steps

  • Build the data warehouse star schema
  • Enter sample data in the model
  • Build the dimensional model in OLAP
  • Build a test front end in Excel using Pivot Tables

Step 1 – Build the data warehouse star schema

In a dimensional model, the business requirements can be illustrated as this star schema (also known as a star join schema). For those unfamiliar with dimensional modeling, the yellow table is called a fact table. It sits at the center of the star schema, holds the numeric measurements of an event, and has keys (links) to the dimensions that describe the event. The blue tables are called dimension tables. In this example the event is a sale, described in the context of the customer, the product sold, and the point in time at which the sale took place.

Remembering that the purpose is to demonstrate the business requirements as fast as possible and with a minimum of rework, I will implement the dimensional model above with manually entered sample data in the data warehouse database. When I build the dimensional model in the timeXtender meta layer, it will look like this. Please note the red fields at the bottom of the fact table. These are lookup fields that translate the natural key joins between my dimension and fact tables into surrogate-based joins.

Step 2 – Enter sample data in the model

Using the custom data feature in timeXtender, I will enter just enough data to be able to demonstrate the model. As a side note, you would be amazed at the difference it makes to your business stakeholders when you use recognizable data rather than more or less random values.

At this point, the model, including the manually entered demo data, only exists in the timeXtender meta model in memory. If I use the deploy and execute features now, timeXtender will generate all the necessary SQL code in the physical data warehouse database, load it with the demo data, and save the project definition to the timeXtender project repository database. The processed fact table will look like this.

  • If you do not need a multidimensional representation of your data (an OLAP cube), the data is at this point already accessible from your front-end tool of choice.
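The star schema, the manually entered demo data and the surrogate-key lookups can be sketched in code roughly like this. This is a stand-in for illustration only: SQLite replaces SQL Server, and the table names, column names and sample values are my assumptions, not what timeXtender actually generates.

```python
import sqlite3

# Step 1 stand-in: a minimal star schema with surrogate keys.
# Names and columns are invented for the example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DimCustomer (CustomerKey INTEGER PRIMARY KEY,  -- surrogate key
                          CustomerId TEXT,                  -- natural key
                          CustomerName TEXT);
CREATE TABLE DimProduct  (ProductKey INTEGER PRIMARY KEY,
                          ProductId TEXT,
                          ProductName TEXT);
CREATE TABLE FactSales   (CustomerKey INTEGER, ProductKey INTEGER,
                          Quantity INTEGER, SalesAmount REAL);
""")

# Step 2 stand-in: just enough recognizable sample data to demo the model.
conn.executemany("INSERT INTO DimCustomer (CustomerId, CustomerName) VALUES (?, ?)",
                 [("C-100", "Contoso Ltd"), ("C-200", "Fabrikam Inc")])
conn.executemany("INSERT INTO DimProduct (ProductId, ProductName) VALUES (?, ?)",
                 [("P-1", "Bicycle"), ("P-2", "Helmet")])

# Staged fact rows still carry natural keys; the lookup below translates
# them into surrogate-key joins, mirroring the red lookup fields on the
# fact table in the model above.
staged = [("C-100", "P-1", 2, 1200.0), ("C-200", "P-2", 1, 45.0)]
conn.executemany("""
    INSERT INTO FactSales (CustomerKey, ProductKey, Quantity, SalesAmount)
    SELECT c.CustomerKey, p.ProductKey, ?, ?
    FROM DimCustomer c, DimProduct p
    WHERE c.CustomerId = ? AND p.ProductId = ?
""", [(qty, amt, cid, pid) for (cid, pid, qty, amt) in staged])

# The processed fact table now contains surrogate keys only.
for row in conn.execute("SELECT * FROM FactSales ORDER BY CustomerKey"):
    print(row)
```

The point of the lookup step is that the fact table never stores the natural keys from the source system, only the surrogate keys assigned by the dimensions.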

Step 3 – Build the dimensional model in OLAP (Analysis Services)

I will use a drag-and-drop approach to create a very simple OLAP representation of the star schema. It will include standard dimensions and standard measures, and will look like this in timeXtender.

Step 4 – Build a test front end in Excel

I will use the most common of all BI front-end tools to build a view on the data I have just built. A Pivot Table is an excellent way to get a feel for the dimensions and measures of a cube.
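What a Pivot Table does can be sketched as a simple cross-tab: one dimension on the rows, one on the columns, and a measure aggregated in the cells. The sample sales rows below are illustrative only.

```python
# Fact rows as (customer, product, sales amount) - sample data only.
sales = [
    ("Contoso Ltd",  "Bicycle", 1200.0),
    ("Contoso Ltd",  "Helmet",    45.0),
    ("Fabrikam Inc", "Bicycle",  600.0),
]

# Build the cross-tab: customers on rows, products on columns,
# summed sales amount in the cells.
pivot = {}
for customer, product, amount in sales:
    cell = pivot.setdefault(customer, {})
    cell[product] = cell.get(product, 0.0) + amount

for customer in sorted(pivot):
    print(customer, pivot[customer])
```

Excel's Pivot Table does exactly this aggregation interactively against the cube, which is why it is such a fast way to sanity-check dimensions and measures.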

What is next?

The Excel workbook I just created can now be delivered to my business stakeholders. They can then use this to verify the business requirements and provide instant feedback. The business stakeholders might come back with extensions or modifications to the dimensional model after seeing the first result. This might lead me down one or more additional iterations of the above steps, adjusting the model according to more detailed requirements. This is all fine and exactly why prototyping is such an efficient way to interact between data warehouse developers and stakeholders. The time I spend at this early point in the process will save me from late and costly rework.

Once the model has been verified, I can continue to identify the data sources needed to load actual data into the dimensional model just built. I will most likely need to make minor changes to the dimensional model on the data warehouse, such as different data types for non-system fields. Except for these minor changes, the entire dimensional model, including the cube, can be reused and put into production after proper testing with real-life data.

 

Video link

My colleague Thomas Lørup Duun has made a video on this topic available on YouTube. The video is actually only 15 minutes long, so the header of this post is a bit conservative :-)


I will be the first to admit it: documentation is not fun. Talking to the business, understanding their requirements, building the dimensional model, developing the physical data warehouse and delivering the results to the business, that is both fun and valuable. Documentation is simply something I have to do.

I do it for the internal and external auditors. I do it for the next developer on the project. I do it for myself, because I cannot remember every little detail of everything I did, and, despite what some developers might say, not all code is self-explanatory. I do documentation because I have to, and because it is invaluable to the business and to the continued development, expansion and enhancement of the data warehouse.

Since there is no doubt about the importance of doing the documentation, and since it is in no way fun, it is fortunate that I can document an entire project in five minutes.

As you are most likely well aware by now, timeXtender is a meta data driven tool that generates code for SQL, SSIS and SSAS. To put it even more strongly: timeXtender is the meta data, and thus it just needs to put that meta data into a document.

The elements of documentation

The automated PDF/XPS documentation that can be generated in timeXtender contains the following elements:

  • Automatic meta data. Every field, table, data source, dimension, cube, measure etc. is automatically included in the documentation.
  • Custom documentation. Virtually every element within a timeXtender project can be described further using custom documentation in Rich Text format.
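The principle behind meta data driven documentation can be sketched as follows. This is purely illustrative: the metadata structure, field names and output format below are invented for the example and bear no relation to timeXtender's actual project model or API.

```python
# A toy project model: every table and field is captured automatically,
# while descriptions are the optional custom (inline) documentation.
metadata = {
    "tables": [
        {"name": "FactSales",
         "fields": ["CustomerKey", "SalesAmount"],
         "description": "One row per sales transaction."},
        {"name": "DimCustomer",
         "fields": ["CustomerKey", "CustomerName"],
         "description": ""},
    ],
}

def document(meta):
    """Walk the metadata and emit one section per table, merging any
    custom description with the automatically captured field list."""
    lines = []
    for table in meta["tables"]:
        lines.append(f"Table: {table['name']}")
        if table["description"]:
            lines.append(f"  Description: {table['description']}")
        for field in table["fields"]:
            lines.append(f"  Field: {field}")
    return "\n".join(lines)

print(document(metadata))
```

Because the document is rendered from the meta data that already drives the code generation, it can never drift out of sync with the project the way hand-written documentation does.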

Add a custom description to an element

If you want to add what we at timeXtender call inline documentation, you simply:

  1. Right click an element such as a table, field or measure
  2. Select the Description option in the context menu
  3. Add the description (which can be formatted using the simple editor or you can even paste in images from the clipboard)

Inline documentation is very powerful, fast and easy to add, simply because you can describe and document an element on the element itself. There are no other tools involved. When working with a project, you can also instantly see whether documentation has been added to an element.

Custom Documentation

Generating the document

Creating the actual document is as easy as this:

  1. Right click the project
  2. Select the Documentation option in the context menu
  3. Specify the document format, path and description settings
  4. Specify any optional settings such as colors and font
  5. Hit the OK button to generate the document

The documentation setup

 

The document will open once it has been created. The screen shot below shows a PDF formatted document. Please notice the attachment icon to the left. Any custom-added descriptions will be attached inline in the document (depending on the settings in the dialog above).

 

Documentation PDF

