Wednesday, December 23, 2009

SSIS: How to fix 'Can't get lock on variable or deadlock occurred' error when you are trying to handle an exception using a Script Component

Issue:

If you have a Script Component in the Control Flow that writes to a log file through a DTS variable (LogFile), and an exception occurs, and your exception-handling component also uses the same LogFile variable to log the error to the file, you enter a deadlock scenario. This happens because the Script Component is still holding the lock on the variable when the flow jumps to the exception handler. All that needs to be done is to unlock the variable before the flow jumps to the exception handler block.

Resolution:

Try

...

Catch ex As Exception
    Logfile.WriteLine("Exception Occurred: " + ex.Message)
    ' Release the lock on the variable before control passes to the
    ' exception handler, which needs to lock the same LogFile variable
    Dts.Variables.Unlock()
    Dts.TaskResult = Dts.Results.Failure
    Throw New Exception(ex.Message)
End Try

Additional Reading:

Reusable Functions for Reading and Writing Variables in SSIS Script Tasks

http://www.developerdotstar.com/community/node/512

Thursday, December 3, 2009

Partitioned Tables and Indexes in SQL Server 2005

Partitioned Tables and Indexes in SQL Server 2005
reference: http://msdn.microsoft.com/en-us/library/ms345146(SQL.90).aspx

This article is a simple abstract of what has to be done. Suppose you have a scenario wherein you need to partition a large table, Orders, based on, for instance, the order date (datetime datatype). You can partition the table by month, quarter, or half-year, depending on the expected requirement.
Step 1
-- add filegroup to the database
-- first filegroup
ALTER DATABASE TestPart ADD FILEGROUP [2007HY1]
-- second filegroup
ALTER DATABASE TestPart ADD FILEGROUP [2007HY2]

Step 2
-- add files to the filegroup
-- add first file to the file group
ALTER DATABASE TestPart
ADD FILE
(NAME = N'2007HY1',
FILENAME = N'C:\TestPart\2007HY1.ndf',
SIZE = 5 MB,
FILEGROWTH = 20%)
TO FILEGROUP [2007HY1]
-- add second file to the file group
ALTER DATABASE TestPart
ADD FILE
(NAME = N'2007HY2',
FILENAME = N'C:\TestPart\2007HY2.ndf',
SIZE = 5 MB,
FILEGROWTH = 20%)
TO FILEGROUP [2007HY2]
Step 3
-- Partition Function
CREATE PARTITION FUNCTION OrderDateRangePFN_HY (datetime)
AS
RANGE LEFT FOR VALUES ('20070630 23:59:59:997',
'20071231 23:59:59:997')

Step 4
-- Partition Scheme
CREATE PARTITION SCHEME OrderDatePScheme_HY
AS
PARTITION OrderDateRangePFN_HY
TO ([2007HY1], [2007HY2], [PRIMARY])

Step 5
-- Create Partitioned Table
CREATE TABLE OrderRange_HY
(
OrderID int NOT NULL,
OrderDate datetime NOT NULL
)
ON OrderDatePScheme_HY (OrderDate)
GO

-- Converting Unpartitioned table to Partitioned Table
-- Converting an unpartitioned table to a partitioned table:
-- create a CLUSTERED index on the partition scheme so the data rows
-- themselves are moved onto the scheme's filegroups
CREATE CLUSTERED INDEX OrderPart_INX
ON OrderRange_HY_Existing (OrderDate) -- this is an existing table
ON OrderDatePScheme_HY (OrderDate)
GO
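
A quick sanity check against the table created above (the sample date is arbitrary): $PARTITION tells you which partition a value maps to, and sys.partitions shows the row counts.

-- which partition does a given order date land in?
SELECT $PARTITION.OrderDateRangePFN_HY('20070415') AS PartitionNumber
GO
-- row counts per partition of the new table
SELECT partition_number, rows
FROM sys.partitions
WHERE object_id = OBJECT_ID('OrderRange_HY')
GO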

Using Partitions in a Microsoft SQL Server 2000 Data Warehouse: http://msdn.microsoft.com/en-us/library/aa902650(SQL.80).aspx

Tuesday, October 13, 2009

Detecting High Memory consuming functions in .NET code

This post is just a pointer to a useful post by Shivprasad Koirala, which can be accessed from the link below.
About the Tool
CLR Profiler is a tool that helps you detect how memory allocation happens in .NET code. The tool is provided by Microsoft.
http://www.dotnetfunda.com/articles/article508-net-best-practice-no-1-detecting-high-memory-consuming-functions-in-net-code.aspx#
Download
http://www.microsoft.com/downloads/details.aspx?familyid=A362781C-3870-43BE-8926-862B40AA0CD0&displaylang=en

Wednesday, October 7, 2009

Batch files - Count the command line argument or pass them to the executable

An interesting post that might help: http://stackoverflow.com/questions/1291941/batch-files-number-of-command-line-arguments. If you are not sure how many arguments will be passed to the executable, this code might help.
#1 Sample
set argC=0
for %%x in (%*) do Set /A argC+=1
echo %argC%

#2 Sample, more robust
@echo off
setlocal enableextensions enabledelayedexpansion
call :getargc argc %*
echo Count is %argc%
echo Args are %*
endlocal
goto :eof
:getargc
set getargc_v0=%1
set /a "%getargc_v0% = 0"
:getargc_l0
if not x%2x==xx (
shift
set /a "%getargc_v0% = %getargc_v0% + 1"
goto :getargc_l0
)
set getargc_v0=
goto :eof
Output
C:\> ArgList.cmd 1 2 3 4 5 6 7 8 9 10 11
Count is 11
Args are 1 2 3 4 5 6 7 8 9 10 11

Why this article was written: Windows batch files can only reference arguments %0 through %9 directly; to handle more than nine arguments you have to use SHIFT, as the second sample does.

Thursday, October 1, 2009

SSIS error DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER when connecting to Oracle data source

Thanks to a useful post by Jorge, here is what to look for to resolve the issue:
1. Check which provider you are using (Microsoft or Oracle)
2. Check the links below in order to resolve the issue
Useful links
64-bit Considerations for Integration Services
http://msdn.microsoft.com/en-us/library/ms141766.aspx
SSIS error DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER when connecting to Oracle data source
http://blogs.msdn.com/jorgepc/archive/2008/02/12/ssis-error-dts-e-cannotacquireconnectionfromconnectionmanager-when-connecting-to-oracle-data-source.aspx
Oracle Drivers on an x64 Box for SSIS Development
http://www.artisconsulting.com/blogs/greggalloway/Lists/Posts/Post.aspx?ID=11

Monday, September 7, 2009

SQL SERVER 2005: CONCAT_NULL_YIELDS_NULL Demystified

After wondering for hours to figure out what the issue might be, here is what happened. This might help if someone is stuck with a similar issue.
For instance,
declare @TestString varchar(10)
--select @TestString = ''
select @TestString = COALESCE(@TestString + '/ ' + 'a', 'b', '')
select @TestString
OUTPUT:
(When SET CONCAT_NULL_YIELDS_NULL ON)
b
(When SET CONCAT_NULL_YIELDS_NULL OFF)
/ a

So, if you are using something like this in your stored procedure, check the setting of the following option:
SET CONCAT_NULL_YIELDS_NULL { ON | OFF }
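
To verify the current session setting, SESSIONPROPERTY returns 1 when the option is ON and 0 when it is OFF:

SELECT SESSIONPROPERTY('CONCAT_NULL_YIELDS_NULL')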

Tuesday, August 25, 2009

SSRS: Export report to the Excel 2003 file



Using SQL Server 2005 SSRS, you can format the reports and use the RDL file in your application to export them to the Excel 2003 file format.
Note: Excel 2007 is not supported, and
the deployment machine should have "Microsoft.ReportViewer.WinForms" in the GAC.

Sample Code:
//Namespaces
using System;
using System.Data;
using System.IO;
using Microsoft.Reporting.WinForms;

class Program
{
    // Function to get the DataTable
    private static DataTable LoadSalesData()
    {
        // code to get the DataTable goes here
        return new DataTable();
    }

    // Main code to render the report to an Excel 2003 file
    static void Main(string[] args)
    {
        Warning[] warnings;
        string[] streamids;
        string mimeType;
        string encoding;
        string extension;
        ReportViewer reportViewer = new ReportViewer();
        // Set processing mode
        reportViewer.ProcessingMode = ProcessingMode.Local;
        // Set the RDL file (keep the RDL file in the Bin folder)
        reportViewer.LocalReport.ReportPath = "MyReport.rdl";
        // Supply a DataTable corresponding to each report data source;
        // the dataset name must be the same as in the RDL file
        reportViewer.LocalReport.DataSources.Add(
            new ReportDataSource("MyDataset", Program.LoadSalesData()));
        // Process and render the report
        reportViewer.RefreshReport();
        // Render to Excel
        byte[] bytes = reportViewer.LocalReport.Render("Excel", null, out mimeType, out encoding, out extension, out streamids, out warnings);
        using (FileStream fs = new FileStream(@"c:\outputReport.xls", FileMode.Create))
        {
            fs.Write(bytes, 0, bytes.Length);
        }
    }
}

Deployment in the production Box
Deploying Reports and ReportViewer Controls
You can freely distribute reports and the ReportViewer controls as part of your application. Deployment requirements vary widely depending on the control type and whether your report is configured for local or remote processing. You can deploy locally processed and remotely processed reports in the same application.
Redistributing the ReportViewer Controls
The ReportViewer control redistributable is a self-extracting file named ReportViewer.exe that includes an .msi and other files. You can find ReportViewer.exe at the following location: C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\BootStrapper\Packages\ReportViewer\ReportViewer.exe.
When you run ReportViewer.exe, the following files are copied to the Global Assembly Cache folder on the deployment computer:
Microsoft.ReportViewer.Common
Microsoft.ReportViewer.ProcessingObjectModel
Microsoft.ReportViewer.WebForms
Microsoft.ReportViewer.WinForms
for more information visit http://msdn.microsoft.com/en-us/library/ms251723(VS.80).aspx

Issue:

While formatting data in HTML, we can use CanGrow and CanShrink to adjust the cell width and height, but when exporting/rendering to Excel, all the cells in the Excel sheet end up with Text Wrap enabled. There is no way to resolve this except using the Excel Interop Services, which require Excel to be installed on the production box.

Resolution:

The issue occurred when I selected a single column and set CanGrow = False and CanShrink = True. If you select all the columns and set these values, the issue is resolved. ;-)

SQL SERVER 2005: Use XACT_ABORT to roll back non-trappable error transactions

The usual way to code a transaction is:
-- (this template assumes it runs inside a stored procedure, hence RETURN)
DECLARE @Return int;
BEGIN TRANSACTION;
BEGIN TRY
    --What if you are calling a function which is missing in the DB?
    --What if you reference a table in another database without a synonym?
    --Under some of these issues the transaction will not roll back
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    SELECT @Return = ERROR_NUMBER()
    SELECT ERROR_SEVERITY() AS ErrorSeverity,
        ERROR_STATE() AS ErrorState,
        ERROR_LINE() AS ErrorLine,
        ERROR_PROCEDURE() AS ErrorProcedure,
        ERROR_MESSAGE() AS ErrorMessage;
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    RETURN @Return
END CATCH

So, in order to handle transactions that fail with non-trappable errors, use SET XACT_ABORT ON: with this setting on, any run-time error terminates the batch and rolls back the transaction.
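
A minimal sketch of the pattern (the table and statements are hypothetical):

SET XACT_ABORT ON;
BEGIN TRANSACTION;
-- hypothetical work; if any statement below raises a run-time error that
-- TRY/CATCH cannot trap, XACT_ABORT terminates the batch and rolls back
INSERT INTO dbo.AuditLog (Message) VALUES ('step 1');
COMMIT TRANSACTION;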

Another good link to check http://wiki.lessthandot.com/index.php/Use_XACT_ABORT_to_roll_back_non_trapable_error_transactions

SQL SERVER 2005: Removing Duplicate records

The following post tells you how to remove duplicate records from a table that does not have a primary key.
CREATE TABLE T1
(
[Ename] char(10) NULL,
[ECity] char(10) NULL
)
GO
INSERT INTO T1 (Ename, ECity)
SELECT 'Hamed', 'Bellevue'
UNION ALL
SELECT 'Billg', 'Redmond'
UNION ALL
SELECT 'Billg', 'Redmond'
Now that duplicate records exist...
; WITH T1CTE(Ename, ECity, Ranking) AS
(
SELECT Ename, ECity, Ranking = DENSE_RANK() OVER (PARTITION BY Ename, ECity ORDER by NEWID() ASC)
FROM T1
)
DELETE FROM T1CTE WHERE Ranking > 1
-- Duplicates from the underlying table T1 are removed when we process the virtual table T1CTE
-- To verify run the following query
SELECT * FROM T1;

Another good reference: http://blogs.techrepublic.com.com/datacenter/?p=420

Tuesday, July 28, 2009

SSIS: Common Issues while using Bulk Insert Task

If you are trying to bulk load the following data from a text file (the first row is the column header):
ID,Ename,City
1,Hamed,Hyderabad
2,Bill,Medina
3,Steve,Redmond
Issue:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2
Resolution:
Make sure that in the Bulk Insert Task Editor, under the "Options" tab, FirstRow is set to 2 (1 is the default). This will avoid loading the column headers.
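
For reference, the equivalent plain T-SQL BULK INSERT uses FIRSTROW for the same purpose (the table name and file path here are hypothetical):

BULK INSERT dbo.Employee
FROM 'C:\Data\employees.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)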

Other issues might be with the text qualifier, or the source file being Unicode. For the first case, make sure the Flat File Connection Manager takes care of it; for the Unicode issue, do a conversion to DT_STR, or make the destination table columns nchar (if char) or nvarchar (if varchar).

Wednesday, July 8, 2009

SSIS: Handling different row types in the same file

Problem Description:
Suppose you have a file with rows of data something like this...
E001 Steve Bellevue
C002 Bill Medina
Where index 0 is the type of record: C = Chairman, E = CEO
Where characters 1 to 3 are the employee ID
Where characters 5 to 9 are the name of the employee
Where characters 10 to 20 are the address
And if there is a requirement to distribute the data based on the record type, use the following solution...
Solution:
Thanks to the post by Allan Mitchell; the following solution helps resolve the problem described above.
Sometimes source systems like to send us text files that are a little out of the ordinary. More and more frequently we are seeing people being sent master and detail or parent and child rows in the same file. Handling this in DTS is painful. In SSIS that changes. In this article we'll take you through one file of this type and provide a solution which is more elegant than what we have today, and hopefully will give you ideas as to how you can handle your own versions of this type of file. As mentioned this is a simple solution but we could extend it to include more powerful processing techniques like performing lookups to see if we already have in our destination any of the rows coming through the pipeline etc. This is how our package should look when everything is finished.
The graphic below shows us the text file sent to us by our source system. As you can see the last column is the type of row so a "M" indicates a master row and a "D" indicates a detail row. In this example there is one master row and three detail rows.
We create a flat file connection manager that points to this file. We are not going to split the lines in the text file up into their respective columns yet because we have two types of rows and they have varying counts of columns. Instead we are going to read in the line as one string. This is how we set up the connection manager to do that.
Once we've done that we add a DataFlow task to our package and in the dataflow behind the task we add a Flat File Source Adapter which points to our earlier connection manager. The only thing we're going to change is the name of the output column, as shown here.
After doing that we are going to flow the output into a Conditional Split Transform. We will use this transform to interpret the line type through the use of the final column in the text file and direct the row down the correct output. With the Conditional Split transform in our example we will have three possible outputs flowing downstream. The first two are based on whether we have decided the row is a Master row or a Detail row and the third is one that we get for free and this is where all other rows that do not meet our criteria will flow.
So we've split up our rows and assigned them to the correct outputs we now need to hook them up to a Script Transform to break the line apart and create our real columns from the one line. When you drag an output from the Conditional Split Transform to the Script Transform you will be asked which output you would like to attach. In the Script Transform the first thing we are going to do is add some columns to the Output. When the data has flowed through this component, downstream components will see these columns and it will be easier for the package designer to assign them to their correct destinations. Here's how we do that.
We are only going to look at the script transform that handles the rows flowing down the master row output, but the code used can easily be adapted to also handle the rows flowing through the detail row output. We now have one column coming into this transform and we have a number of columns we have just created ready to take data from this column so let's write some script to do that.
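
The script itself appeared as an image in the original article; a minimal sketch of what the master-row version might look like is below, assuming the single input column is named TheWholeLineColumn and that hypothetical output columns MasterID and MasterName were added in the step above.

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' split the single input line on commas and spread it across the output columns
    Dim cols() As String = Row.TheWholeLineColumn.Split(","c)
    Row.MasterID = cols(0)    ' hypothetical output column
    Row.MasterName = cols(1)  ' hypothetical output column
End Sub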
All this code is doing is taking the column that comes in (TheWholeLineColumn) and splitting it up based on a comma delimiter. We then assign elements in the created array out to our output columns. All that is left now is to hook up our output from the script transform to a destination, and for ease of use we have simply pointed the output to a Raw File Destination Adapter. We have added a dataviewer to the flow of data going into the Script transform so we can see the definition of the row going in. You can see this in the graphic below.
Finally, all that is left for us to do is run the package and watch the rows flow down the pipeline. What we should see is that one row flows to the master output and three rows flow down to the detail output.

SSIS: Choosing a Configuration Approach for Deployment

The following are some sample approaches; they are not the de facto standard...

Recommendations for Specific Scenarios

1. If the deployment environment supports environment variables and has a SQL Server database available, SQL Server configurations (located indirectly through an environment variable) are a good fit. If a SQL Server database is not available, then XML files with indirect configurations (based on environment variables) can be used in a very similar way.

2. If the deployment environment does not allow environment variables, but a consistent path can be provided in each environment, then XML configurations can be used. The path should be identical in each environment (for example, “C:\SSIS_Config\”), so that a local copy of the configuration file can be deployed to each environment without having to modify the path to point to a new configuration location.

3. If environment variables are not allowed, and a consistent local path cannot be used, the Parent Package Variable approach is often the best approach, as it isolates any environmental changes to the parent package.
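
For example, the indirect approach in scenarios 1 and 2 hinges on a per-machine environment variable that holds the location of the configuration; the variable name and path below are assumptions:

REM one-time, per-server setup (run as administrator); the package's XML
REM configuration is then pointed at this variable instead of a fixed path
setx SSIS_CONFIG_FILE "C:\SSIS_Config\MyPackage.dtsConfig" /M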

Monday, April 27, 2009

VS ASP.NET Error: it is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level.

Error
Error while compiling solution in Visual Studio .NET: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS

Resolution
If there are any web.config files in the sub-folders, make sure you don't duplicate entries from the root folder's web.config in the sub-folder's web.config.

For instance, if the folder structure of your ASP.NET application is like this...
Login.aspx
Web.config
(the root web.config contains:
<system.web>
    <authentication mode="Windows" />
</system.web>
)
Secure (folder)
Home.aspx
Web.config
(if I add the same entry in the subfolder web.config, I get the error:
<system.web>
    <authentication mode="Windows" />
</system.web>
)
(if I remove or comment out those entries in the subfolder web.config, the error goes away:
<system.web>
    <!-- <authentication mode="Windows" /> -->
</system.web>
)

Tuesday, April 21, 2009

How to deep copy an object in .NET

Introduction
This article gives a quick look at how to deep copy your objects. The idea is not to give an insight into why cloning is required or which is the best method to do it, as there are numerous articles on cloning; this article gives the implementation of deep copying in .NET.
Implementation
In a shallow copy, the value types are copied, whereas the reference types refer to the same shared memory. With a deep copy, the memory referenced by the object is also copied (cloned), and each object references its own instance of the data. There are different ways of deep copying an object. One method is to use a memory stream, wherein the object is serialized into a memory stream and then deserialized into a new object. Another approach is to use the Reflection concepts of .NET, an article on which can be read on Code Project at this link:
http://www.codeproject.com/KB/cs/cloneimpl_class.aspx .
So in this article the approach would be to use the first one and understand the implementation and complexities involved. Please note that this approach requires us to serialize the object that needs to be cloned.
Complexities are involved where you might need to include or exclude some fields of the object, which can be achieved using attributes such as [NonSerialized] on the fields to be excluded.
Conclusion
This article brings together the different approaches to deep copy a .NET object, for reference. We also get to look at what happens in the default copy, wherein a member-wise copy is done.

Code
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

/// <summary>
/// Abstract class that implements deep cloning using a memory stream.
/// </summary>
[Serializable]
public abstract class MyBaseObject : ICloneable
{
    public object Clone()
    {
        // Deep copy: serialize this instance into a memory stream,
        // then deserialize it into a brand-new object graph.
        MemoryStream memoryStream = new MemoryStream();
        BinaryFormatter binaryFormatter = new BinaryFormatter();
        binaryFormatter.Serialize(memoryStream, this);
        memoryStream.Position = 0;
        object clone = binaryFormatter.Deserialize(memoryStream);
        memoryStream.Close();
        return clone;
    }
}

/// <summary>
/// This class is referenced from the class that needs to be cloned.
/// </summary>
[Serializable]
public class MyReferenceClass : MyBaseObject
{
    /// <summary>
    /// Some Id.
    /// </summary>
    private int _id;

    // Property added so the constructor below can set the Id.
    public int ID
    {
        get { return _id; }
        set { _id = value; }
    }
}

/// <summary>
/// Class that needs to be cloned.
/// </summary>
[Serializable]
public class MyCloneableClass : MyBaseObject
{
    /// <summary>
    /// A reference-type member of the class to be cloned.
    /// </summary>
    public MyReferenceClass _myReferenceClass;

    /// <summary>
    /// A simple member of the class to be cloned.
    /// </summary>
    private string _name;

    /// <summary>
    /// Constructor.
    /// </summary>
    public MyCloneableClass(int id)
    {
        _myReferenceClass = new MyReferenceClass();
        _myReferenceClass.ID = id;
    }
}
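
A quick usage sketch (hypothetical) showing that the clone gets its own copy of the reference-type member:

MyCloneableClass original = new MyCloneableClass(1);
MyCloneableClass copy = (MyCloneableClass)original.Clone();
// the reference member was cloned too, so the two are distinct instances:
Console.WriteLine(object.ReferenceEquals(original._myReferenceClass, copy._myReferenceClass)); // False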

Thursday, February 12, 2009

WebPart: Register tag is added automatically to .ASPX file when you add PPS web part

This observation can be used in other contexts.
Suppose you have an .ASPX page in SharePoint 2007 that has a PerformancePoint Dashboard web part. To use the SharePoint site on a client machine that does not have PPS, remove:
<%@ Register TagPrefix="WpNS1" Namespace="Microsoft.PerformancePoint.Scorecards.WebParts" .... %>

If the client machine has PPS installed, this entry is automatically added when you add the PPS web part to your page.

Tuesday, February 10, 2009

IIS 7: Resolving Anonymous Logon to SQL from ASP.NET Application deployed on IIS7

If you face an issue wherein your ASP.NET application deployed on IIS7 connects to SQL (SQL Server 2008 Analysis Services) as an anonymous user, then do the following:
1. Create a new application pool "MyPool" with Managed Pipeline mode set to Classic, Load User Profile set to False, and Identity set to Custom, using a domain user (and password) that has access to the database.
2. Under 'Authentication' for the website itself (the website is now under the "MyPool" application pool), set Anonymous Authentication to Enabled and ASP.NET Impersonation to Disabled.
3. In web.config, leave impersonation set to "false".
http://mvolo.com/blogs/serverside/archive/2007/12/08/IIS-7.0-Breaking-Changes-ASP.NET-2.0-applications-Integrated-mode.aspx
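
A minimal web.config sketch for step 3:

<configuration>
  <system.web>
    <identity impersonate="false" />
  </system.web>
</configuration>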

Thursday, February 5, 2009

SharePoint: Sharepoint connects as Anonymous User to SQL Analysis Server on a Remote Machine and fails to retrieve data from CUBE

If you are trying to connect to a remote Analysis Services 2008 (64-bit) instance and it connects as an anonymous user, it is for one of the following reasons.

1) Make sure the authentication is Windows.
2) Try this: in web.config, set
<identity impersonate="false" />
When it is "true":
the browsing user's account has to connect to/open the SQL server, which requires Kerberos to be configured for the website (the setting you do while setting up WebSite:PortNumber).
When it is "false":
the website's identity account has to connect to/open the SQL server, and Kerberos is not required for the website.

Monday, February 2, 2009

.NET: Reading Property Values using Reflection

using System;
using System.Reflection;

class CA
{
    private string _myProperty = "Hello";

    // A setter is included so the reflection code below can write the property back.
    public string MyProperty
    {
        get { return _myProperty; }
        set { _myProperty = value; }
    }
}

class ReadPropertyGeneric<T> where T : class
{
    Type myType;
    T _objClass;

    public void ReadAndSetProperty(T objClass)
    {
        myType = typeof(T);
        _objClass = objClass;
        // Get the public instance properties.
        PropertyInfo[] myPropertyInfo = myType.GetProperties(BindingFlags.Public | BindingFlags.Instance);
        Console.WriteLine("The number of public properties is {0}.", myPropertyInfo.Length);
        // Display the public properties.
        DisplayPropertyInfo(myPropertyInfo);
    }

    public void DisplayPropertyInfo(PropertyInfo[] myPropertyInfo)
    {
        // Display information for all properties, then write a modified value back.
        for (int i = 0; i < myPropertyInfo.Length; i++)
        {
            PropertyInfo myPropInfo = myPropertyInfo[i];
            Console.WriteLine("The property name is {0}.", myPropInfo.Name);
            Console.WriteLine("The property type is {0}.", myPropInfo.PropertyType);
            Console.WriteLine("The property value is {0}.", myPropInfo.GetValue(_objClass, null));
            object setValueString = myPropInfo.GetValue(_objClass, null) + "NewValue";
            myPropInfo.SetValue(_objClass, setValueString, new object[0]);
        }
    }
}
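
A hypothetical usage pairing the two classes above:

ReadPropertyGeneric<CA> reader = new ReadPropertyGeneric<CA>();
reader.ReadAndSetProperty(new CA());
// prints the name, type and value of MyProperty, then sets it to "HelloNewValue"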

Tuesday, January 20, 2009

SharePoint: Redirect Web Part to another Page

If you want to redirect from the Document Library's AllItems.aspx page to another page on the OnLoad event, create a redirect Web Part that redirects to the desired page, and add it to the AllItems.aspx page.
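
A minimal sketch of such a redirect web part (the class name and target URL are hypothetical):

using System;
using System.Web.UI.WebControls.WebParts;

public class RedirectWebPart : WebPart
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // redirect as soon as the page hosting this web part loads
        Page.Response.Redirect("/SitePages/Target.aspx"); // hypothetical target page
    }
}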
