SQL Server Integration Services
What is it?
  • Microsoft’s ETL solution bundled with SQL Server
  • E – Extract, T – Transform, L – Load
  • (Diagram: Source → Read → SSIS → Write → Destination)
Demo: A simple ETL demo
Demo: You can get a little crazy...
Why?
Confusion reigns!
Reasons Not to Consider SSIS
  • Performance
    • Consider SQL or BCP for simple imports (see the T-SQL sketch below)
    • File system performance
  • Data latency
    • SSIS is not a near real time solution
  • SOA, ESB, B2B integration
    • No business rules support
    • Very basic queue support
    • XML support limited

Reasons To Consider SSIS
  • Merging data from heterogeneous data stores
  • Populating data warehouses
  • Cleaning and standardizing data
  • Building business intelligence into a data transformation process
  • Automating administrative functions and data loading
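(A minimal T-SQL sketch of the “SQL or BCP for simple imports” point above, assuming a hypothetical dbo.Exercise table and file path – a one-off flat-file load rarely needs an SSIS package:)

    -- Minimal flat-file import without SSIS (hypothetical names).
    BULK INSERT dbo.Exercise
    FROM 'C:\data\exercise.csv'
    WITH (
        FIRSTROW = 2,            -- skip the header row
        FIELDTERMINATOR = ',',   -- CSV column separator
        ROWTERMINATOR = '\n'     -- one record per line
    );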
What’s New in 2008
  • Lookup transformation performance improvements and new caching options
  • ADO.NET source and destination components
  • Data profiling task and viewer
  • Wizard interface for defining source and destination
  • Scripts (for the Script Transform) are now written in Visual Studio, and thus in .NET languages
  • New package format
  • Three new data formats for working with times (see the sketch below)
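(A small illustration of the last bullet, assuming the three new time-related SSIS data types correspond to SQL Server 2008’s new temporal column types:)

    -- Hypothetical table using SQL Server 2008's new temporal types, which the
    -- new SSIS time-related data types are assumed to map to.
    CREATE TABLE dbo.ExerciseLog
    (
        StartTime     time(7),            -- time of day with fractional seconds
        RecordedAt    datetime2(7),       -- wider range and precision than datetime
        RecordedAtUtc datetimeoffset(7)   -- includes the time-zone offset
    );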
x64 Limitations
  • Command line tools (dtexec, dtutil) cannot co-exist with their 32-bit versions
  • No DTS support
  • Limitations on data providers – no Access, Excel or SQL Compact
  • IA64 has more limitations, including no designer support
Demo: Running in 32-bit on 64-bit!

SQL Server Integration Services

Editor's Notes

  • #2 PREP:
    • SQL Server BIDS – demo solution open
    • SQL Server Management Studio with cleanup script
    • ZoomIt
    • Make sure the solution is in 64-bit mode!
  • #4 Demo steps:
    • Create a new package.
    • Add a Data Flow component.
    • Add a flat file connection – set it to the exercise.csv data, using the suggested data types.
    • Add a flat file source – bind it to the connection.
    • Add a Conditional Split; link it to the flat file source; split on distance > 0.
    • Add a Sort; link it to the main split output; sort on date.
    • Add an ADO.NET destination connection manager – pointing to spacedata.exercise.
    • Add an ADO.NET Destination; link it to the Sort and the connection manager; do the mapping.
    • Add a variable – make sure its scope is the package.
    • Add a Row Count; link it to the Conditional Split else output; link it to the variable.
    • On the Control Flow, add an SMTP task: SMTP connection to webmail.bbd.co.za with Windows auth; message body expression from file, and change the variable.
    • Save -> Run -> crash.
    • Change distance to float on the flat file connection; trickle the changes.
    • Save -> Run -> email.
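    • The steps assume a spacedata.exercise destination table and a cleanup script; the real schema isn’t captured in these notes, so this is only a hypothetical version covering the two columns the steps mention:
      -- Hypothetical destination table for the demo (actual schema not recorded here).
      CREATE TABLE spacedata.exercise
      (
          [date]   datetime2(0) NOT NULL,
          distance float        NOT NULL
      );
      -- Cleanup script to reset the table between demo runs.
      TRUNCATE TABLE spacedata.exercise;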
  • #5 Show completed solution data flow
  • #7 At a simple level, moving data around sounds very easy, but when you break it down it gets more and more complex, and different needs require different tools. SSIS is not a Swiss Army knife; it is a specialist tool.
  • #8 SSIS adds layers of support, logging, etc. These all add a performance hit, and you should not waste time using it for things where a quick BCP or even C# code would be better. SSIS is perfect for batch solutions, but it is bad for near real time solutions. There are tools built into SQL Server (linked servers) and other tools (BizTalk) which handle real time well. SSIS is not a SOA/ESB/B2B tool; it is an ETL tool.
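    • A minimal sketch of the linked-server alternative for near real time reads, with hypothetical server, database and table names – the query hits the remote data at call time, with no SSIS batch in between:
      -- Near-real-time read over a linked server (hypothetical names).
      SELECT e.ExerciseDate, e.Distance
      FROM [RemoteServer].[SpaceData].[dbo].[Exercise] AS e
      WHERE e.ExerciseDate >= DATEADD(DAY, -1, GETDATE());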
  • #10 Lookup performance:
    • The ability to take rows that do not have matching entries in the reference dataset and load those rows into the cache.
    • The ability to use separate data flows to load the reference dataset into the cache and to perform lookups on the reference dataset.
  • Data profiling is used to profile the data in SQL Server to see the quality/type of data; useful for identifying improvements and fixing bugs.
  • There is a tool to convert packages to the new format.
  • #12 Demo steps:
    • Use the Management Studio script to clean the data out.
    • Add an Excel source.
    • Delete the flat file source.
    • Connect the Excel source to the split input.
    • Open the Excel source, create a new OLE DB connection.
    • Trickle the changes.
    • Save -> Run -> crash.
    • Project -> Properties -> Debugging -> Run64BitRuntime -> False.
    • Save -> Run.
  • #13 Recordset info: http://blogs.conchango.com/jamiethomson/archive/2006/06/28/SSIS_3A00_-Comparing-performance-of-a-raw-file-against-a-recordset-destination.aspx
    • Memory – make sure you have enough.
    • SELECT * – all that metadata and the unused columns need processing!
    • Small packages – you can call one package from another. This allows the work to be broken up, which means a team can easily work on the solution, makes fault finding easier, and lowers overheads.
    • Comments – DUH!
    • Understanding the components is key to super usage – many can be used to do the same thing. Lookup and Merge Join, for instance, can both be used to look up data; Lookup has three modes which impose performance vs. memory trade-offs, whereas Merge Join does not. Merge Join requires sorted input while Lookup doesn’t. Execute SQL allows any SQL dialect while Execute T-SQL allows only T-SQL.
    • Just because things can be put in parallel doesn’t mean they execute in parallel. Some are async and some aren’t!
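    • A small illustration of the SELECT * point, with hypothetical column names – listing only the columns the data flow needs keeps the pipeline metadata and buffers small:
      -- Avoid SELECT * as an SSIS source query; every column's metadata travels
      -- through the pipeline whether it is used or not.
      SELECT [date], distance
      FROM spacedata.exercise
      WHERE distance > 0;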