Monday, August 30, 2010

.NET FileSystemWatcher and dropping multiple files

If you spend too much time in the event handler, FileSystemWatcher's internal buffer can overflow and events get missed. This leads to lost business functionality. There are a lot of questions around this topic (how to handle multiple file drops).

Even though you cannot afford to spend much time in the FileSystemWatcher event handlers, you still have to wait at least until the file copy is complete (the Created event fires as soon as the first byte is written).

I design and develop quite a few server applications that have to be scalable and handle any amount of load. I solved the above-mentioned problem as follows.

I created two separate services: 'FileWatcher' and 'FileProcessor'.

FILE WATCHER SERVICE
This consists of two logical components. The first one writes the full path of the dropped file to a message queue (MSMQ) called 'pre-copy-queue'; note that there is absolutely no delay here. The other component picks up one item at a time from this queue and waits until the copy is complete before it makes an entry in a queue (MSMQ) called 'FileReadyToBeProcessed'. I am also processing multiple files in parallel using SmartThreadPool. A sketch of the idea follows.
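Here is a minimal sketch of the watcher side, assuming MSMQ via System.Messaging; the queue paths, the drop folder, and the exclusive-open polling trick are my illustration, not the original code:

using System.IO;
using System.Messaging;
using System.Threading;

class FileWatcherService
{
    static void Main()
    {
        var preCopyQueue = new MessageQueue(@".\private$\pre-copy-queue");
        var readyQueue = new MessageQueue(@".\private$\FileReadyToBeProcessed");
        preCopyQueue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

        // First component: do almost nothing in the handler -- just record the path.
        var watcher = new FileSystemWatcher(@"C:\DropFolder");
        watcher.Created += (s, e) => preCopyQueue.Send(e.FullPath);
        watcher.EnableRaisingEvents = true;

        // Second component: drain 'pre-copy-queue', wait out the copy,
        // then hand the file over to 'FileReadyToBeProcessed'.
        while (true)
        {
            string path = (string)preCopyQueue.Receive().Body;
            WaitUntilCopied(path);
            readyQueue.Send(path);
        }
    }

    static void WaitUntilCopied(string path)
    {
        while (true)
        {
            try
            {
                // An exclusive open succeeds only after the writer lets go of the file.
                using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                    return;
            }
            catch (IOException)
            {
                Thread.Sleep(500); // still being copied; poll again
            }
        }
    }
}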

FILE PROCESSOR SERVICE
This process picks up items from the queue (MSMQ) 'FileReadyToBeProcessed' and does whatever needs to be done. I am using SmartThreadPool to process multiple items in parallel; a matching sketch follows.
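A matching sketch of the processor loop; the SmartThreadPool call follows its standard QueueWorkItem pattern, and ProcessFile is a stand-in for the real business logic:

using System.Messaging;
using Amib.Threading; // SmartThreadPool

class FileProcessorService
{
    static void Main()
    {
        var stp = new SmartThreadPool();
        var readyQueue = new MessageQueue(@".\private$\FileReadyToBeProcessed");
        readyQueue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

        while (true)
        {
            string path = (string)readyQueue.Receive().Body; // blocks until a message arrives
            stp.QueueWorkItem(() => ProcessFile(path));      // fan the work out across the pool
        }
    }

    static void ProcessFile(string path)
    {
        // whatever needs to be done with the file goes here
    }
}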

If you are interested in the full code, let me know.

Monday, July 26, 2010

VS 2010 and Database projects

I run a small company, which means we have to do things the 'smart' way.

I do not think many people are aware of the cool things that are available in VS 2010 database projects.

I am so used to the 'refactor' feature in C# projects. Guess what: this feature is now available for database projects. For example, if you rename a table, VS 2010 will automatically update all the stored procedures that use it.

Moving from .NET 3.5 to .NET 4.0

There are a lot of projects using Membership Providers (for authentication and authorization). In my case, the user data (user info, passwords, roles, etc.) was created by code running on .NET 3.5.

Recently I wanted to start using .NET 4.0, and all of a sudden my users complained that they were not able to log in. When I did the root-cause analysis, I found that the default algorithm used for hashing passwords had been changed from 'SHA1' (as far as I can tell, the .NET 4.0 default is 'HMACSHA256').

In order to make things work, I had to explicitly specify 'SHA1'.
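A sketch of that web.config change, using the standard hashAlgorithmType attribute on the membership element (your provider settings stay as they were):

<system.web>
  <!-- pin the pre-4.0 default so existing SHA1 password hashes keep validating -->
  <membership hashAlgorithmType="SHA1">
    <!-- existing provider configuration goes here, unchanged -->
  </membership>
</system.web>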

Sunday, May 23, 2010

Silverlight - Azure projects and generating proxy

1. Keep the 'useRequestHeadersForMetadataAddress' behavior in your WCF role's web.config (see the sketch after this list).

2. Run one instance of VS 2010 with all the roles and start debugging.

3. While that application instance is running, open another instance of VS 2010, go to your Silverlight project, and right-click the service reference to update the proxy.
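For reference, the behavior from step 1 looks roughly like this (the default port shown is an assumption; adjust it to your endpoints):

<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior>
        <serviceMetadata httpGetEnabled="true" />
        <!-- build metadata addresses from the incoming Host header, which is what
             the Azure load balancer / local dev fabric put in front of the service -->
        <useRequestHeadersForMetadataAddress>
          <defaultPorts>
            <add scheme="http" port="80" />
          </defaultPorts>
        </useRequestHeadersForMetadataAddress>
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>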


This has worked great for me, so I thought I would share it with others working on similar projects.

Saturday, May 15, 2010

Installing SharePoint 2010 RTM on Windows 7

1. Make a folder called 'SharePointFiles' on your C: drive (C:\SharePointFiles)
2. Copy the contents of your media (CD or DVD) to this folder
3. Open the file C:\SharePointFiles\Files\Setup\Config.config (it is an XML file)
4. Add the following line

<Setting Id="AllowWindowsClientInstall" Value="True"/>

5. Save this file.
6. Run setup.exe

Saturday, May 08, 2010

VS 2010 and the Azure development storage utility DSInit

Looks like, by default, Azure development storage is configured to use SQL Server Express. You have to use the DSInit utility to point it at your SQL Server 2008 Standard/Enterprise edition.
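For example, to point development storage at the default local instance ('.' is the usual shorthand; use the instance name instead if yours is a named instance):

DSInit /sqlinstance:.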

You will see the following runtime error if SQL Server Express is not installed.

Windows Azure Tools: Failed to initialize the Development Storage service. Unable to start Development Storage. Failed to start Development Storage: the SQL Server instance ‘localhost\SQLExpress’ could not be found. Please configure the SQL Server instance for Development Storage using the ‘DSInit’ utility in the Windows Azure SDK.

Tuesday, April 27, 2010

BI: SQL Server 2008 and NTILE

BI is all about converting data into 'information'. Even after that effort, you need to filter the noise out of the information before it is presented to the end user.

For example, every financial BI solution will have some sort of 'Balance' amount, meaning money that is owed to you but not yet paid.

Instead of showing every row in your balance (fact) table, you can categorize each row as 'Low', 'Medium', or 'High'. When higher-level people (decision makers) look at the report, you show only the rows in the 'High' category.

How do I do this at the data access layer?

SQL Server 2008 has the NTILE ranking function, which distributes rows as evenly as possible into a requested number of buckets:

SELECT service_date, balance_amount,
CASE NTILE(3) OVER(ORDER BY balance_amount, service_date)
WHEN 1 THEN 'low'
WHEN 2 THEN 'medium'
WHEN 3 THEN 'high'
END AS lvl
FROM fact_balance
ORDER BY balance_amount, service_date;

Saturday, April 24, 2010

TFS 2010 RTM installation and Reporting Services error


When you try to install TFS 2010 RTM, you may hit a Reporting Services error during configuration.


You need to edit the rsreportserver.config file located at
C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer and change the value of 'SecureConnectionLevel' to 0 (zero).
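The entry in rsreportserver.config should end up looking like this:

<Add Key="SecureConnectionLevel" Value="0"/>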


VS 2010 RTM and Silverlight 4


I am not sure what is causing this, but when I tried to change the Silverlight target version from 3 to 4, I kept getting an error.

In order to fix this problem, you need to change the registry entry from V3 to V4 at:

x86

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\10.0\DesignerPlatforms\Silverlight

x64

HKLM\Software\Wow6432Node\Microsoft\VisualStudio\10.0\DesignerPlatforms\Silverlight



Azure and .NET 4.0

After downloading the latest VS 2010, I thought I would make my Azure projects target .NET 4.0.
Looks like it is not supported yet.

I see the following compiler error:

Cloud Service projects currently support Roles that run on .NET Framework version 3.5.
Please set the Target Framework property in the project settings for this Role to .NET Framework 3.5

Tuesday, April 20, 2010

Data warehouse: Populating fact table with multiple date keys

In any practical data warehouse, there will always be a few fact tables that have more than one date key (order date and shipped date, for example).

There are many ways people populate such facts. Here is what I did today in one of my projects:

I am using Kimball's spreadsheet to populate the date dimension, so the surrogate keys are kind of 'smart' (I know, I know, you don't want surrogate keys to be smart...).
For example, the key for '12/5/2009' will be '20091205'.

Assuming this is the scheme you are using for the date dimension surrogate keys, you can use the following script to load the fact table:

ORDER_DATE_KEY = RTRIM(LTRIM(STR(DATEPART(yyyy, @myDate)) +
RTRIM(CASE WHEN MONTH(@myDate) < 10
THEN '0' + CONVERT(Char, DATEPART(Month, @myDate))
ELSE CONVERT(Char, DATEPART(Month, @myDate)) END) +
RTRIM(CASE WHEN DAY(@myDate) < 10
THEN '0' + CONVERT(Char, DATEPART(dd, @myDate))
ELSE CONVERT(Char, DATEPART(dd, @myDate)) END))),
SHIP_DATE_KEY = RTRIM(LTRIM(STR(DATEPART(yyyy, @myDate2)) +
RTRIM(CASE WHEN MONTH(@myDate2) < 10
THEN '0' + CONVERT(Char, DATEPART(Month, @myDate2))
ELSE CONVERT(Char, DATEPART(Month, @myDate2)) END) +
RTRIM(CASE WHEN DAY(@myDate2) < 10
THEN '0' + CONVERT(Char, DATEPART(dd, @myDate2))
ELSE CONVERT(Char, DATEPART(dd, @myDate2)) END)))
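As an aside (my addition, not part of the original script): if the key scheme is strictly yyyymmdd, conversion style 112 produces the same string in a single call:

ORDER_DATE_KEY = CONVERT(char(8), @myDate, 112),
SHIP_DATE_KEY = CONVERT(char(8), @myDate2, 112)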
Sunday, March 21, 2010

Snow picture


It snowed here this winter, and I had an opportunity to take a few good pictures.


Adding certificate to your Azure Service

The following are the steps you need to take to enable SSL:

1. Create the certificate request. I did this with IIS 7 Manager on Windows 2008. For the common name I used my company name (instead of myapp.cloudapp.net).
2. I used GoDaddy (there are several places to get a certificate) to get mine. You have to upload the request created in step 1 (it is just a text file).
3. If you are the owner of the common name used in step 1, you will get an email to approve the certificate request. Approve it.
4. Go back to the GoDaddy site and download the certificate (in my case it was a .CRT file).
5. For Azure, we need a .PFX file. To create one from the certificate we just obtained from GoDaddy, first install the certificate in IIS 7.0. I did this by double-clicking the downloaded certificate and installing it in my personal store, then exported it to a .PFX (right-click and select export).
6. In the development environment (VS 2008/VS 2010), select the appropriate role and add this certificate (see the sketch after this list).
7. Upload the .PFX file to your Azure account.
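Step 6 ends up recording something like this in ServiceConfiguration.cscfg (the role and certificate names are mine and the thumbprint is a placeholder):

<Role name="MyWebRole">
  ...
  <Certificates>
    <Certificate name="SslCert" thumbprint="YOUR_CERT_THUMBPRINT" thumbprintAlgorithm="sha1" />
  </Certificates>
</Role>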

Saturday, February 13, 2010

VS 2010 RC installation

Installed VS 2010 RC successfully. Here is the list of things to keep in mind while installing:

1. Uninstall VS 2010 beta 2
2. Uninstall .NET 4.0 (there are two items related to this that you have to uninstall)
3. Uninstall any VS 2010 Office development components

If you are developing Azure applications:

1. Uninstall Azure SDK
2. Uninstall VS 2010 and VS 2008 tools for Azure

----------------------------------------------------------
a] Install VS 2010 RC
b] Install VS 2010 RC TFS Client

c] Install Azure SDK (1.1)
d] Install the Feb 2010 release of the Azure tools (not the Nov 2009 release)

VS 2010 RC and Silverlight Client Proxy with collections

I have this application. Some of the data types in the server application are collections derived from ObservableCollection.

Example:
public class VisitCounts : ObservableCollection<VisitCount>
{
}

In VS 2008, the Silverlight client application generates a proxy where I can still reference ObservableCollection in e.Result (of async calls).

For whatever reason, VS 2010 RC is not able to generate a proxy with these collection types (even after tweaking the advanced options).

Friday, February 05, 2010

SSIS and MERGE statement

I encountered an error while working on a DW/ETL project.

If you are using dynamic connection manager settings in your SSIS packages, you may have set the connection string as follows:

Provider=SQLOLEDB.1;Initial Catalog=MYDB;Data Source=(local);User ID=myuser;Password=myPassword;

Well, this works just fine as long as you are NOT using any of the new SQL Server 2008 features.

In my case, I had to use the MERGE statement, and the above connection string did not work. After spending a couple of hours, I found the issue: you have to make sure the new SQL Server Native Client provider is used in your connection string. It should look like this:


Provider=SQLNCLI10;Initial Catalog=MYDB;Data Source=(local);User ID=myuser;Password=myPassword;
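For reference, here is a minimal MERGE of the kind that tripped the old provider (table and column names are hypothetical):

MERGE dbo.DimCustomer AS target
USING staging.Customer AS source
    ON target.CustomerId = source.CustomerId
WHEN MATCHED THEN
    UPDATE SET target.Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (CustomerId, Name) VALUES (source.CustomerId, source.Name);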

Hope this helps....
