Wednesday, 29 February 2012
HttpPostedFileBase.FileName Property is different depending upon the browser
HttpPostedFileBase.FileName returns the full client path and filename if Internet Explorer is used to upload a file; if Mozilla Firefox is used, it contains just the filename.
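A minimal server-side guard is to normalise the name before using it; the action name and save path below are illustrative, not from the original post:

```csharp
// Sketch of an MVC upload action; "Upload" and the App_Data path
// are illustrative. Path.GetFileName normalises the browser
// difference: IE's "C:\Users\bob\photo.jpg" and Firefox's
// "photo.jpg" both yield "photo.jpg".
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    if (file != null && file.ContentLength > 0)
    {
        string safeName = System.IO.Path.GetFileName(file.FileName);
        file.SaveAs(Server.MapPath("~/App_Data/" + safeName));
    }
    return RedirectToAction("Index");
}
```

This way the code never depends on which browser performed the upload.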
Tuesday, 28 February 2012
Reinstalling MVC3
I had an old version of ASP.NET MVC3 installed on a machine and could not reinstall because of an old RTM installation of Razor ("ASP.NET Web Files"). The solution was to fully uninstall, in conjunction with a bit of registry hacking:
http://blog.williamhilsum.com/2011/03/error-0x80070643-installing-aspnet-mvc.html
http://forums.asp.net/t/1661809.aspx/1
Debugging setup problems with msiinv.exe
http://blogs.msdn.com/b/astebner/archive/2005/07/01/434814.aspx
401.2 error with IIS Express
I was receiving a 401.2 error with IIS Express.
The problem was because the applicationhost.config file had <windowsAuthentication enabled="false"> in the configuration and the application was configured to use Windows Authentication.
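For reference, the relevant fragment of applicationhost.config looks roughly like this (the nesting is the standard IIS Express layout; your file may scope it under a <location> element instead):

```xml
<!-- applicationhost.config: enable Windows Authentication.
     Your file may wrap this in a <location path="MySite"> element. -->
<system.webServer>
  <security>
    <authentication>
      <windowsAuthentication enabled="true" />
    </authentication>
  </security>
</system.webServer>
```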
Monday, 27 February 2012
The connection name 'LocalSqlServer' was not found in the applications configuration or the connection string is empty.
This error occurs when the local ASP.NET application providers (security, profile, etc.) are configured badly. It may happen because ASP.NET was initially configured to run against SQLExpress, which was subsequently uninstalled.
Mine was:
<add connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|aspnetdb.mdf;User Instance=true;" name="LocalSqlServer" providerName="System.Data.SqlClient" />
and I no longer had SQLExpress installed.
The solution is to find the machine.config / web.config that refers to the bad connection string. This can be a bit of an adventure. I searched all of the config files (machine.config, web.config) for all of the .NET versions at C:\Windows\Microsoft.NET\Framework\xxxx\Config\. In the end I discovered that because I had a 64-bit machine there were 64 bit versions also at C:\Windows\Microsoft.NET\Framework64!
There is of course a simple way of fixing the problem. Open up Inetmgr, and select the "Connection Strings" section. The connection string LocalSqlServer is in there. Edit this to point to your DB (or file). If you need to reinstall the providers into SQL Server, use aspnet_regsql.exe.
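For reference, a typical aspnet_regsql.exe invocation looks like this (run it from the matching Framework folder; the server name is a placeholder, -E uses Windows authentication and -A all installs all provider features):

```
aspnet_regsql.exe -S MYSERVER -E -A all
```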
To understand which is the underlying file that is being edited, there are two rules:
Are you running 64-bit? If so the config file will be located under C:\Windows\Microsoft.NET\Framework64, and for 32-bit it will be C:\Windows\Microsoft.NET\Framework.
What .NET version is the default IIS server running under? If it is v2.0, then the subfolder will be v2.0.50727, and for 4.0 it will be v4.0.30319.
Then check the machine.config and web.config.
Update
Modifying the web.config did not fix the problem. It was even more complicated than that.
The local application had a <clear/> element in its <connectionStrings> section, and its RoleManager and Profile sections each had a <clear/> child element, but it did not clear down all of the necessary sections. A search for LocalSqlServer in the web.config showed that it was also being used by other providers, such as the healthMonitoring element.
Clearing the <connectionStrings> element in the application's config cleared the LocalSqlServer connection string and thus invalidated those elements inherited from the root web.config.
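Rather than clearing the inherited entry, one option is to override it in the application's web.config; a sketch, with a placeholder server name:

```xml
<!-- Sketch: override the inherited LocalSqlServer entry instead of
     clearing it. "MYSERVER" is a placeholder. -->
<connectionStrings>
  <remove name="LocalSqlServer" />
  <add name="LocalSqlServer"
       connectionString="Data Source=MYSERVER;Initial Catalog=aspnetdb;Integrated Security=SSPI"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The <remove> keeps the machine-level entry from colliding with the local one, and every provider that references LocalSqlServer continues to resolve.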
Friday, 24 February 2012
Bug in OpenLink Endur Simulation runner
There is a bug in the OpenLink Endur simulation runner.
If you call the
public SimResults Run(Transactions trans)
method and your simulation contains a User Defined Simulation Result (UDSR), then it excludes the UDSR results from your resultset.
The solution is to use the Run overload that takes a QueryResult. Populate the QueryResult with the appropriate Transaction IDs.
Monday, 20 February 2012
Debugging SSIS packages
My SSIS package was doing a large ETL from an OLE DB source to a destination OLE DB. I received the error:
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The package started to error. So I redirected the error output from the destination OLE DB to a flat file.
The error columns indicated the following:
Error Code -1071607685 Error Column 0 Error Desc No status is available.
Yet manually inserting the first row from the error output through SQL Management Studio succeeded. If the row could be inserted by hand, why was it in the error output?
The thing to notice here is that the OLE DB Destination load is using the "Fast Load" process and has a large commit size. That means if one row errors then all of them in the commit size get redirected to the error output.
An improved technique to find the true error rows is to reduce the "Rows per batch" and "Maximum insert commit size" to 10,000 rows. Redirect the error output to another OLE DB load task, but this time instead of setting "Table or View - fast load", select "Table or view". This will load rows individually. Redirect the error output of this to a flat file and this time you will get the exact rows that fail. This technique keeps the initial speed of the fast load, but falls back to finer-grained inserts when an error occurs.
In my case it was a foreign key constraint that was not being satisfied.
Footnote
I also discovered, by changing the destination to a SQL Server destination, that four columns had the wrong data type. The destination column was a SMALLINT whereas the data flow path was using a signed int. Some posts indicate that a cast error can cause similar errors, and it may occur if the data flow value exceeds the size of the destination column, so it is worth double-checking the data sizes. It shows that the OLE DB destination is less strict than the SQL Server destination.
Thursday, 16 February 2012
Modifying OneNote's Windows shortcut keys
I use SlickRun, so was rather surprised after one reboot when OneNote 2010 claimed exclusive control over my Windows+S shortcut key.
Fortunately it can be remapped, as described here.
Sunday, 12 February 2012
FreeNAS - remounting a broken SoftRAID
In FreeNAS, I stopped my RAID by removing one of the disks. This caused great consternation: the mount point disappeared, warnings appeared across the UI (on the Disk Management and Mount Point pages), and the RAID showed as "Stopped". Attempts to restart it failed.
Browsing the disk using the File Manager did not show any data at the mount point. At this point the panic set in, as I thought I had lost my data.
However, all is not lost. You can still access the data. The mount point just needs re-adding.
I deleted the RAID (Disks > Software RAID > Delete), ignoring the warning message, then selected the individual disk and re-added the mount point.
Saturday, 11 February 2012
SQL Server, the XML data type and DTDs
SQL Server provides limited support for DTDs in the XML data type.
Take the following XHTML EMail message:
<?xml version="1.0" encoding="utf-16"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>My Title</title>
</head> <body>
<p>My Body</p> </body>
</html>
If you try to insert this as a string into the XML data column, you might first get the error:
Parsing XML with internal subset DTDs not allowed. Use CONVERT with style option 2 to enable limited internal subset DTD support.
The solution is to perform a CONVERT, specifying a style flag for the XML data type:
INSERT INTO MyTable (MyXmlColumn) VALUES (CONVERT(xml, @MyXml, 2))
However, this strips XML DTD fragments from the insert. You get the message:
XML DTD has been stripped from one or more XML fragments. External subsets, if any, have been ignored.
And the resulting data in the column is without the xml declaration and the Document Type Definition:
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>My Title</title>
</head> <body>
<p>My Body</p> </body>
</html>
If your intention is to store this XML data purely for sending an XHTML message, and you don't need to perform XML functions (XSLT, XPath, etc.), then you really are better off storing it as nvarchar(max).
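Putting the pieces together, a worked sketch of the insert (table and variable names are illustrative; the DECLARE-with-initialiser syntax assumes SQL Server 2008 or later):

```sql
-- Sketch: table and variable names are illustrative.
CREATE TABLE MyTable (MyXmlColumn xml);

DECLARE @MyXml nvarchar(max) = N'<!DOCTYPE html PUBLIC
  "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>My Title</title></head>
  <body><p>My Body</p></body>
</html>';

-- Style 2 allows a limited internal DTD subset to be parsed,
-- but the DOCTYPE is still stripped from the stored value.
INSERT INTO MyTable (MyXmlColumn) VALUES (CONVERT(xml, @MyXml, 2));
```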
Windows 7 and Playstation 3 (PS3) Media Servers
For the Playstation 3 to detect a Windows 7 PC as a Media Server, that Windows 7 PC must be configured to be a member of a home network. If you set it to Work it will not be detected.
FreeNAS UPNP server not running - cannot browse administrative WebGUI.
My FreeNAS uPNP server was configured and enabled, but I could not browse to the Administrative WebGUI.
Looking at the Services > Status menu, the UPnP service was enabled but the status showed a red bar.
In the end I figured it out: the content DB folder was missing. Somehow (I don't know how yet) it had been deleted, which meant the service was set up but failed to launch. The diagnostics and logs did not help either; they showed no problems.
Creating a FAT16 partition on a USB flash drive greater than 4GB
I wanted to create a FAT16 partition on a large capacity (> 4GB) USB flash drive.
Unfortunately Windows only allows you to format removable USB media at its full capacity, and the maximum size of a FAT16 partition is 4GB. In the end I created a GParted Live CD and used GParted to create a 4GB FAT16 partition on the flash drive.
Disk Manipulation Tools
GParted - Create Partitions
MBRWiz - Modify the MBR
HP DL160 SoftPaq halts on "Loading FreeDOS" when booting from USB
I used the HP SoftPaq to create a bootable USB key to install a ROM update, but the system halted on "Loading FreeDOS". I suspected that this was because the USB key was of a large capacity (30GB), so I ended up trying various ways of creating a bootable USB key that was lower capacity and/or didn't use FreeDOS.
In the end the following worked:
Use the HP USB Disk Storage Format Tool.
Create a FAT32 partition, making it bootable by copying Windows 98 boot files onto it.
Then simply copy the Softpaqs onto the media, using the flat files folder.
Reboot the server. The USB boots as drive A, and you can see the SoftPaqs within the drive.
Mounting an external NTFS USB drive with FreeNAS
I tried to mount an external USB drive with FreeNAS. The drive was partitioned as a dynamic disk in Windows 7 and formatted as NTFS.
Numerous attempts to mount the drive with ntfs-3g and other mount commands all failed. I received messages like "Operation Failed" and "The device xxx doesn't seem to have a valid NTFS" and "NTFS Signature Missing".
In the end I repartitioned the drive, this time selecting a Simple Volume on an MBR disk in Windows 7 and reformatting as NTFS. The limitation with this approach is that the disk can only have a 2TB partition, but that was enough.
To mount the drive and copy files to it, I used the following commands.
mkdir /mnt/usb
kldload fuse
ntfs-3g /dev/da1s1 /mnt/usb
rsync -av --ignore-errors --exclude .recycle /mnt/Data /mnt/usb
If you want to exclude folders then use
rsync -av --ignore-errors --exclude Andrew/Bulgaria /mnt/Data /mnt/usb
Note that the inclusion and exclusion of the slashes is important, and the folder is relative to the source folder that was specified.
Check the size of the copied folder with du -hs "My Photos"
Footnote: I'm starting to lose the will to live with FreeBSD and FreeNAS. There are so many petty complications. For example, RSYNC sometimes fails to work; it reports "Sending File List" and then goes idle. A look at the log indicates "root: Previous local synchronization still running... exiting".
ps -eaf tells me that RSYNC is in an "uninterruptible sleep" (status D).
The solution? Reboot, then rm rsync* from /var/run.
Another problem is that I occasionally get:
rsync: recv_generator: mkdir "/mnt/usb/Data/a/b/c" failed: No such file or directory (2)
*** Skipping any contents from this failed directory ***
Friday, 10 February 2012
Booting a Phoenix Award BIOS v6.00pg from a USB pen drive
I had some difficulties getting a low power PC to boot Freenas from USB.
These instructions describe how to prepare a USB pen drive and a PC for booting Freenas from USB.
Use DISKPART to LIST DISK, SELECT DISK <number> and CLEAN the disk (careful to choose the USB disk). This ensures there are no partitions on the disk.
Format the USB pen drive with HPUSBDisk.exe.
Format the disk with FAT32.
Use PhysDiskWrite and PhysGUI.exe to burn the Freenas image to the USB disk.
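The DISKPART step, as a transcript (disk number 1 is an example; check the LIST DISK output first, because CLEAN wipes the selected disk):

```
DISKPART> list disk
DISKPART> select disk 1
DISKPART> clean
```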
Check the computer's BIOS settings:
Phoenix - AwardBIOS
Advanced BIOS Features
Hard Disk Boot Priority:
- 1. USB-HDD0 A-Data USB Flash Driv
- 2. Bootable Add-in Cards
- 3. SATA1
- 4. SATA 2
Second Boot Device: Disabled
Third Boot Device: Disabled
Boot Other Device: Disabled
Integrated Peripherals
Onboard Device Function >
USB Mass Storage Device Boot Setting > USB Flash Drive 0.0 : FDD Mode
USB Keyboard Legacy Support: Enabled
USB Mouse Legacy Support: Enabled
USB Storage Legacy Support: Enabled
I don't know whether it helped, but I had a PS/2 keyboard plugged in instead of a USB keyboard.
Monday, 6 February 2012
Useful Visual Studio add-ins
- Ghost Doc
- BIDS Helper
- Productivity Power Tools
- PowerCommands for Visual Studio 2010
Processing a SSAS cube after dimension changes
If you change the dimensions and fail to process the SSAS database fully, you may get an error similar to this:
Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: '<table>', Column: '<column>', Value: '<value>'. The attribute is '<attribute name>'. Errors in the OLAP storage engine: The record was skipped because the attribute key was not found. Attribute: <attribute name> of Dimension: <dimension name> from Database: <database>, Cube: <cube>, Measure Group: <measure group>, Partition: <partition>, Record: <record number>.
Processing the cube does not automatically process the dimensions.
If you have changed the dimensions, then you should process the database.
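One way to process the whole database is an XMLA ProcessFull command, run for example from SQL Server Management Studio (the DatabaseID below is a placeholder):

```xml
<!-- Sketch: process the entire SSAS database so that dimensions
     are reprocessed along with the cube. "MyDatabase" is a placeholder. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyDatabase</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
```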
Sunday, 5 February 2012
Using the Welcome screen to show all users on a Windows 7 domain client
I have a Windows 7 laptop which is a client of my home domain. The kids like to use it and don't like logging in with their username all the time. When a Windows 7 machine becomes a client of a domain you lose the Welcome screen which displays the last logged in users and instead you get the classic login.
Fortunately you can restore it through group policy; this is described here.