Thursday 29 November 2012

Checking in from VS2008 to TFS2010

To check in from VS2008 to TFS2010, you need to install the Team Foundation Server Power Tools (October 2008 release), which extend the Team Foundation Server integration with Visual Studio.

Installing the Power Tools also enables custom check-in policies.

Context-sensitive help in Endur

To get context-sensitive help in Endur, locate the help file (e.g. C:\OpenLink\helpdir\olf.chm).

Edit your start configuration, setting the environment variable:

SET AB_HELP_FILE=C:\OpenLink\helpdir\olf.chm

To set the variable immediately, go into System Wide Configuration, enter Help into the Search box and click on AB_HELP_FILE.
In the right-hand pane, select the username and enter the User Value (e.g. C:\OpenLink\helpdir\olf.chm).

Click Save.

Thursday 22 November 2012

T-SQL: Selecting the top n rows from a Group By


I have the following data:

deal_tracking_num  tran_num  type_id  value    personnel_id  last_update
897693             897693    20065    Manual   20015         2012-11-22 10:59:38.240
897693             897693    20065    Non-STP  20015         2012-11-22 11:09:09.190
897695             897695    20065    Manual   20015         2012-11-22 10:58:58.063
897695             897702    20065    Manual   20015         2012-11-22 10:59:23.493
897698             897698    20065    STP      20015         2012-11-22 10:16:23.903
897698             897698    20065    eCM      20015         2012-11-22 10:18:38.587
897698             897701    20065    eCM      20015         2012-11-22 10:21:07.350
897698             897703    20065    eCM      20015         2012-11-22 11:22:25.377

To select the first row (the earliest last_update) for each deal_tracking_num, use the following query:

SELECT a.deal_tracking_num, a.value
FROM
(
     SELECT ab.deal_tracking_num, ati.*,
            ROW_NUMBER() OVER (PARTITION BY ab.deal_tracking_num ORDER BY ati.last_update) AS rank
     FROM ab_tran_info_history ati JOIN ab_tran ab ON ab.tran_num = ati.tran_num
     WHERE ati.type_id = 20065 -- Trade Confirmation Type
) a
WHERE a.rank = 1

This returns:

deal_tracking_num  value
897693             Manual
897695             Manual
897698             STP

To return the top n rows per group rather than just the first, filter on rank <= n instead of rank = 1.

Wednesday 21 November 2012

Log4Net - writing to event log with an impersonated user

If an ASP.NET application impersonates a user, log4net may fail to write to the event log.

One way around this is to permission the Event Log so that all Authenticated Users can write to it (a quick verification sketch follows the steps below).

This is described here:
http://support.microsoft.com/kb/2028427

and
http://jpadda.wordpress.com/2010/08/08/event-log-write-permissions/



1. Determine the Primary Domain Controller by executing the following command in an elevated command prompt:

NETDOM QUERY /D:<mydomain> PDC

2. Create a VBS script with the following code:

strComputer = "localhost"
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")

Set objAccount = objWMIService.Get _
    ("Win32_UserAccount.Name='<username>',Domain='<mydomain>'")
Wscript.Echo objAccount.SID

3. Run the script in an elevated command prompt to obtain the user's SID:

cscript <code.vbs>

4. Dump the existing Application Event Log security descriptor to a file:
wevtutil gl application > C:\temp\security.txt

5. Grant Authenticated Users write access: edit the line containing the channelAccess string, appending (A;;0x3;;;AU).

6. Apply the updated setting, e.g.

wevtutil sl Application /ca:O:BAG:SYD:(A;;0xf0007;;;SY)(A;;0x7;;;BA)(A;;0x7;;;SO)(A;;0x3;;;IU)(A;;0x3;;;SU)(A;;0x3;;;S-1-5-3)(A;;0x3;;;S-1-5-33)(A;;0x1;;;S-1-5-32-573)(A;;0x3;;;AU)
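
Once the ACL has been applied, a quick way to confirm that the impersonated account can now write to the Application log is a minimal log4net sketch along these lines (assuming a reference to log4net; "MyImpersonatedApp" is a hypothetical event source that must already be registered on the machine):

using log4net;
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

class EventLogWriteCheck
{
    static void Main()
    {
        // Configure an EventLogAppender programmatically.
        // "MyImpersonatedApp" is a hypothetical, pre-registered event source.
        var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
        layout.ActivateOptions();

        var appender = new EventLogAppender
        {
            ApplicationName = "MyImpersonatedApp",
            LogName = "Application",
            Layout = layout
        };
        appender.ActivateOptions();
        BasicConfigurator.Configure(appender);

        // If the (A;;0x3;;;AU) entry has been applied, this should now succeed
        // even when the process is running under an impersonated identity.
        ILog log = LogManager.GetLogger(typeof(EventLogWriteCheck));
        log.Info("Event log write test under impersonation");
    }
}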

Tuesday 20 November 2012

Copying data from one database to another

Using SSIS is a good way of bulk copying data from one database to another.

You can use a wizard within SQL Server Management Studio to create the SSIS package. Right-click on the database and click Tasks > Export Data. At the end you have the option to create an SSIS package and either save it directly to an SSIS server or as a file.

There are many ways to run the SSIS package. For developers it is probably simplest to create an SSIS project and add the created package to it; you can then run the package from within Visual Studio. The package can also be executed programmatically, as sketched below.
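
For completeness, here is a minimal sketch of executing a saved package via the SSIS managed runtime. It assumes a reference to the Microsoft.SqlServer.ManagedDTS assembly, and the package path is a hypothetical example:

using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunExportPackage
{
    static void Main()
    {
        // Hypothetical path to the package produced by the Export Data wizard.
        const string packagePath = @"C:\Packages\ExportData.dtsx";

        var app = new Application();

        // Load the package from the file system (no IDTSEvents listener, hence null).
        Package package = app.LoadPackage(packagePath, null);

        // Execute the package and report the overall outcome.
        DTSExecResult result = package.Execute();
        Console.WriteLine("Package finished with result: {0}", result);
    }
}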

Thursday 15 November 2012

Cleaning up the WinSXS folder on Windows 2008 R2

This article describes how to clean up the WinSXS folder on Windows Server 2008 R2.

Basically, you open an elevated command prompt and enter
DISM.exe /online /Cleanup-Image /spsuperseded

Tuesday 13 November 2012

.NET 4.0 / 4.5 serialization - System.Runtime.Serialization.InvalidDataContractException: No set method for property x in type y

Last night I observed a difference in the handling of WCF service contracts between .NET 4.0 and 4.5.

The service has an interface that allows inherited types to be passed over it.

[ServiceKnownType("GetKnownProductTypes", typeof(KnownTypeHelper))]
public interface IDealBookingService
{
  [OperationContract]
  int SaveProduct(ProductBase product);
}
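
The KnownTypeHelper referenced by the attribute is not shown above; a minimal sketch of the shape WCF expects (a static method taking an ICustomAttributeProvider and returning the known types; the body here is a hypothetical example) would be:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static class KnownTypeHelper
{
    // WCF requires the method named in [ServiceKnownType] to be static,
    // to take an ICustomAttributeProvider and to return IEnumerable<Type>.
    public static IEnumerable<Type> GetKnownProductTypes(ICustomAttributeProvider provider)
    {
        // Hypothetical implementation: return every concrete type derived
        // from ProductBase in the contract assembly.
        return typeof(ProductBase).Assembly
            .GetTypes()
            .Where(t => typeof(ProductBase).IsAssignableFrom(t) && !t.IsAbstract);
    }
}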
 
One of these derived types, CommStorProduct, had a property that did not have a set accessor:

[DataMember]
public DateTime StartDate3
{
    get { return StartDate; }
}

On the developer's machine this ran without problems.
However, when deployed to a downstream environment, the following error was thrown when the service started:

Unhandled Exception: System.Runtime.Serialization.InvalidDataContractException: No set method for property 'StartDate3' in type 'ServiceContract.MyDerivedType'.
   at System.Runtime.Serialization.DataContract.DataContractCriticalHelper.ThrowInvalidDataContractException(String message, Type type)
   at System.Runtime.Serialization.ClassDataContract.ClassDataContractCriticalHelper.ImportDataMembers()
   at System.Runtime.Serialization.ClassDataContract.ClassDataContractCriticalHelper..ctor(Type type)
   at System.Runtime.Serialization.DataContract.DataContractCriticalHelper.CreateDataContract(Int32 id, RuntimeTypeHandle typeHandle, Type type)
   at System.Runtime.Serialization.DataContract.DataContractCriticalHelper.GetDataContractSkipValidation(Int32 id, RuntimeTypeHandle typeHandle, Type type)
   at System.Runtime.Serialization.XsdDataContractExporter.GetSchemaTypeName(Type type)
   at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter.ValidateDataContractType(Type type)
   at System.ServiceModel.Dispatcher.DataContractSerializerOperationFormatter..ctor(OperationDescription description, DataContractFormatAttribute dataContractFormatAttribute, DataContractSerializerOperationBehavior serializerFactory)
   at System.ServiceModel.Description.DataContractSerializerOperationBehavior.GetFormatter(OperationDescription operation, Boolean& formatRequest, Boolean& formatReply, Boolean isProxy)
   at System.ServiceModel.Description.DataContractSerializerOperationBehavior.System.ServiceModel.Description.IOperationBehavior.ApplyDispatchBehavior(OperationDescription description, DispatchOperation dispatch)
   at System.ServiceModel.Description.DispatcherBuilder.BindOperations(ContractDescription contract, ClientRuntime proxy, DispatchRuntime dispatch)
   at System.ServiceModel.Description.DispatcherBuilder.InitializeServiceHost(ServiceDescription description, ServiceHostBase serviceHost)
   at System.ServiceModel.ServiceHostBase.InitializeRuntime()
   at System.ServiceModel.ServiceHostBase.OnBeginOpen()
   at System.ServiceModel.ServiceHostBase.OnOpen(TimeSpan timeout)
   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
   at Service.Host.ServiceController.Start() in C:\Source Code\Test\TestWCF\Service.Host\ServiceController.cs:line 14
   at Service.Host.Program.Main(String[] args) in C:\Source Code\Test\TestWCF\Service.Host\Program.cs:line 13

The developer's machine had .NET 4.5 installed. After installing .NET 4.5 on the target machine the problem was resolved. (The other, code-based solution was to add an internal set accessor to the data class, as the exception hinted; a sketch follows below.)
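
For reference, a minimal sketch of that code-based fix, using a cut-down stand-in for the real data class (the empty internal setter is an assumption, on the basis that StartDate3 is purely derived from StartDate):

using System;
using System.Runtime.Serialization;

[DataContract]
public class CommStorProduct   // cut-down stand-in for the real derived type
{
    [DataMember]
    public DateTime StartDate { get; set; }

    [DataMember]
    public DateTime StartDate3
    {
        get { return StartDate; }
        // A non-public setter is enough to satisfy the .NET 4.0 DataContractSerializer;
        // the value is deliberately ignored because StartDate3 is derived from StartDate.
        internal set { }
    }
}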

Inspecting the .NET code shows there is a runtime serialization difference between the two versions: .NET 4.5 is more forgiving than .NET 4.0 in this scenario.

 
I haven’t found any documentation about serialization changes between the versions.

Thursday 8 November 2012

SyntaxHighlighter is a very flexible code beautifier, suitable for styling code and SQL on the Web.

SELECT * versus SELECT Col1, Col2 and Query Plans

Yesterday I deployed a colleague's project (a website and a SQL Server database) to my local machine. I hadn't seen the code before, but I'd seen the site in action. The website used Entity Framework to connect to the underlying database.

When I first ran the web page today there was no data in the database, so I ran an SSIS package to populate it.

Then, when I reloaded the web page, it timed out while executing the SQL query. Odd, I thought, so I looked at the underlying query. Entity Framework was issuing:

SELECT 
1 AS [C1], 
[GroupBy1].[K2] AS [IntraDayIntLegalEntityName], 
[GroupBy1].[K1] AS [IntraDayIntPortfolioName], 
[GroupBy1].[K3] AS [BaseCurrency], 
[GroupBy1].[A1] AS [C2], 
[GroupBy1].[A2] AS [C3]
FROM ( SELECT 
 [Extent1].[K1] AS [K1], 
 [Extent1].[K2] AS [K2], 
 [Extent1].[K3] AS [K3], 
 SUM([Extent1].[A1]) AS [A1], 
 SUM([Extent1].[A2]) AS [A2]
 FROM ( SELECT 
  [Extent1].[IntraDayIntPortfolioName] AS [K1], 
  [Extent1].[IntraDayIntLegalEntityName] AS [K2], 
  [Extent1].[BaseCurrency] AS [K3], 
  [Extent1].[IntraDayBasePayment] - [Extent1].[YestBasePayment] AS [A1], 
  [Extent1].[IntraDayBasePayment] - [Extent1].[YearEndBasePayment] AS [A2]
  FROM [FxR].[RawDataView] AS [Extent1]
 )  AS [Extent1]
 GROUP BY [K1], [K2], [K3]
)  AS [GroupBy1]

So I ran this query and it took ages. Longer than 15 minutes!

So I started to debug this. The query selected from a view which itself did several left joins on another view.
But even more bizarrely, if I ran

SELECT * FROM [FxR].[RawDataView] AS [Extent1]

the query took only 10 seconds. If I instead ran

SELECT [IntraDayIntLegalEntityName] FROM [FxR].[RawDataView] AS [Extent1]

it took longer than 15 minutes! Hold on, why should SELECT * be quicker than selecting a single column?

Time to debug! Enter the query plan. It was clear that the query plans were different for the two queries. Looking through the query plan (right to left, top to bottom) I could see that the faster query seemed to pass only 3,000 rows early on, whereas the slow query passed 100,000 rows several times. This helped explain the slowness.


Also running the command

SET STATISTICS IO ON

showed the following information:

Table 'TradeData'. Scan count xx, logical reads xxxxx.

The scan count and logical reads were vastly higher for the slow query.

Then, finally, I spotted the clue that led to the solution.
Looking at the query plan, the estimated number of rows was vastly different from the actual number of rows. Clearly the plan was expecting far fewer rows than it got: the statistics it was based on were out of date.

Running the following query showed when the statistics were last updated:

SELECT
    t.name AS Table_Name,
    i.name AS Index_Name,
    i.type_desc AS Index_Type,
    STATS_DATE(i.object_id, i.index_id) AS Date_Updated
FROM
    sys.indexes i
    JOIN sys.tables t ON t.object_id = i.object_id
    JOIN sys.partitions sp ON i.object_id = sp.object_id
WHERE
    i.type > 0 AND sp.rows > 0
    -- AND i.name LIKE '%RawDataView%'
ORDER BY
    STATS_DATE(i.object_id, i.index_id) DESC,
    t.name ASC,
    i.type_desc ASC,
    i.name ASC

This indicated that the statistics were last updated yesterday. SQL Server had determined the query plan based upon yesterday's data, when there was none there! After adding 100,000 rows today, that query plan was hopelessly out of date and no longer applicable.

The solution? I ran

UPDATE STATISTICS Fxr.TradeData WITH FULLSCAN

on each table, and suddenly my query went down from over 15 minutes to 10 seconds. The moral of the story? Learn to read query plans, and update your statistics if your data changes significantly!