Monday, May 16, 2011

Dongles

I hate dongles.

They're a real pain in the ass.  Expensive to replace if lost or broken.  Most require a PC dedicated to their function.

Despite my efforts to eliminate dongles altogether, I still have three left to worry about, and they're critical at that.  One licenses our Attendance Enterprise payroll software by Infotronics, another licenses our Avaya IP Office 403 telephone system, and the last licenses Performance Series by Diagraph, which prints all of our stock labels.  Over the years, I've had to move these dongles from one PC to another as hard drives and mainboards died, Windows needed reinstallation, users changed positions, etc.

We recently migrated from full Windows desktops to a terminal server, which left nowhere to host the dongles.  Sure, I could set up a PC just for that, and eventually that's exactly what I did.  I found the Fit-PC at www.fit-pc.com, a small and inexpensive x86-compatible PC that fits (pun intended) nicely anywhere in the rack I choose to put it and uses very little power.

I spent several hours trying to install the Attendance Enterprise Security Manager, a system service which validates the license according to the dongle and plants a record in the database of the AE application.  I learned a long time ago that even though the service itself is merely a single executable with its own configuration registry key, it requires the installation of the full application in order to work properly, or it will throw bogus errors that suggest the dongle is not present.

The problem this time was the installer, which eventually demands the presence of the dongle to proceed and merely disables the Next button until it is found.  Despite using the Sentinel Advanced Medic to validate that the dongle was working, the installer would not find it.  Procmon yielded no clues.  Monitoring USB activity was worthless.  I tried setting up a VMware virtual machine to duplicate the issue, and duplicated it there exactly.

Finally, after several hours and starting over several times, I decided while installing the Sentinel Protection drivers to install the parallel port drivers in addition to the USB drivers.  See, normally I wouldn't think to install parallel port drivers on a machine with no parallel port (like a Fit-PC); moreover, I would expect that to fail.  It does not fail, and in fact the installer suddenly and magically finds the USB dongle once the PARALLEL drivers are present.  Nice one, guys.

But, once installed, the service fails to start.  AESECURITY.EXE pops a BEX error and must be added to the Data Execution Prevention (DEP) exclusion list, found under System Properties > Advanced > Performance settings, to work.  Way to go.

And finally, even though the installer pretends to support Windows authentication to the remote SQL Server, it fails to include Integrated Security=SSPI in the connection string.  Fortunately this is easy to fix in the registry.
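
For illustration only, a Windows-authenticated connection string would look like the line below.  The server and database names are placeholders, and the exact registry value holding the string depends on the AESECURITY configuration key mentioned earlier, so check your own installation before editing anything.

Server=YOURSERVER;Database=YOURDATABASE;Integrated Security=SSPI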

Congrats, Infotronics.  You've just topped my list of stupid software.  Your developers should be embarrassed.

Time, effort, and actual expense (in the form of a PC that does nothing except host the dongles, for example) aside, the biggest problem I have with dongles is that they fundamentally transfer the cost of anti-piracy from the manufacturer onto the customer, and not just once but on an ongoing basis.

Instead of an Internet-based activation and licensing service (paid for in some fashion by the software manufacturer), of which there are many that work well, the customer is forced to pay for the dongle as part of the software package, and to maintain its operation and relocation over time across desktop migrations, upgrades, and hardware maintenance.  Even if the software manufacturer goes out of business and disappears, the customer continues to bear the cost of the dongle even though the license is no longer valid and no company is left to care about potential piracy.  And good luck if the dongle breaks then.

I will never again buy a piece of hardware or software that requires a dongle.  Any company that would force me to isn't a company with which I want to do business.

Saturday, June 26, 2010

UPS Worldship 2010 on Windows 7

Running UPS Worldship 2010 in a shared network configuration, i.e. an Administrative station with Remote stations, with UAC enabled exposes an issue on Windows Vista and Windows 7.  I have personally only experienced this on Windows 7, mostly because we ignored Windows Vista entirely in our organization, but since the root cause is the behavior of UAC it stands to reason that Windows Vista may be similarly affected.

The following assumes that UPS Worldship is being run from an account that is not Administrator but is included in the local Administrators user group.  Much of the following cannot be demonstrated while logged on as Administrator, because elevation of Command prompts and some software is automatic and not a choice.

In a shared network configuration, UPS Worldship stores some of its files on a mapped network share.  When Worldship self-installs a patch delivered via UPS mail, the patch installer demands UAC elevation, and when complete it automatically relaunches the UPS Worldship application, which fails, claiming to be unable to read an INI file that should be located on the shared network space.  Launching UPS Worldship manually after the patch completes, however, succeeds.

Attempting to launch UPS Worldship using Run As Administrator produces the same error.  This makes sense, since the elevated patch installer would also launch UPS Worldship in an elevated state upon completion of the patch.  Other symptoms are errors while launching the UPS Worldship Support Tools, which demand UAC elevation and then try to read configuration files from the mapped network share, failing with similar errors.

The root issue is easily exposed by opening an unelevated Command prompt and typing NET USE, which shows the mapped drives as connected without error.  Then open an elevated Command prompt and type NET USE again; the same mapped drives are still listed, but broken in an error state, unable to connect.
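
For illustration, the two prompts look roughly like this; the share name is a placeholder and the exact output varies a little by Windows version.  First, the unelevated prompt:

C:\>NET USE
Status       Local     Remote                    Network
-------------------------------------------------------------------------------
OK           W:        \\server\worldship        Microsoft Windows Network

And then the elevated prompt:

C:\>NET USE
Status       Local     Remote                    Network
-------------------------------------------------------------------------------
Unavailable  W:        \\server\worldship        Microsoft Windows Network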

This issue is commonly misreported and misunderstood as a symptom of user-specific drive mappings, where a user who maps network drives and subsequently logs off and back on as another user (perhaps Administrator) will not see those mapped drives, since they are stored in the user registry and are therefore user specific.  It seems that Run As Administrator is a somewhat misunderstood function in Windows 7.  Some apparently believe it to be identical to using Run As in Windows XP and providing alternate credentials, typically a local or domain Administrator, after which the spawned process runs under the alternate user context, complete with that alternate user's registry, including mapped drives and such.

However, Run As Administrator in Windows 7 retains the current user context and registry but demands elevation for the spawned process instead.  The evidence of this difference in the Command prompt example above is simply that NET USE under an elevated Command prompt shows the same mapped drives as the unelevated Command prompt.  The user context, and thus the registry, is not changed, but for some reason the elevated security token cannot access the mapped drives whereas the unelevated security token (provided by default) can.

If Run As Administrator on Windows 7 behaved as some people mistakenly believe it does, i.e. identical to Run As on Windows XP using Administrator credentials, then NET USE in that Command prompt would not show any mapped network shares at all, unless of course you had logged on as Administrator and mapped them separately.

The fix for this is documented at http://technet.microsoft.com/en-us/library/ee844140(WS.10).aspx.  Once the EnableLinkedConnections fix is implemented, the elevated Command prompt will be able to access the mapped drives, and UPS Worldship and other programs that use mapped network shares will function normally even while elevated.
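
For reference, the fix boils down to a single registry value.  A sketch of applying it from an elevated Command prompt, based on the TechNet article above (reboot afterward for it to take effect):

REG ADD "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v EnableLinkedConnections /t REG_DWORD /d 1 /f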

I am somewhat disappointed, but not surprised, that UPS Worldship does not implement this fix as part of its installation.  The expectation from UPS is that Worldship will be executed from the Administrator account itself, not just from a user account that is a member of the local Administrators group.  However, the need for a mapped network share will typically force the use of a domain account as the primary user context for running Worldship, and best practices will lead many administrators to install the software while logged on as local Administrator or a member of their Domain Admins group, then switch to their domain user for launching it.  Unfortunately, since UPS Worldship is self-patching, that domain user must inevitably be included in the local Administrators group for convenience's sake, even though the software itself can operate otherwise.  It's one of many cases where developers take the easy way out by demanding local administrative rights for the logged-on user, creating additional risk at the desktop instead of doing the work to follow security best practices and the Principle of Least Privilege.

Thursday, June 24, 2010

Tiered SQL Server Backups Using Task Scheduler

Sometimes using the Task Scheduler to execute tiered (read: Grandfather-Father-Son) SQL Server backups to disk is preferable to creating a maintenance plan.  While maintenance plans are powerful and easy, they require Integration Services to be installed, and in the case of standalone SQL Server Express installations that can be troublesome.

I use the following script as the foundation for my Task Scheduler jobs.

DECLARE @backupType NVARCHAR(3) -- BAK, DIF, or TRN
DECLARE @deleteDays INT

SET @backupType = N'$(BACKUPTYPE)'
SET @deleteDays = $(DELETEDAYS)
DECLARE @dateDeleteBefore SMALLDATETIME
SET @dateDeleteBefore = DATEADD(dd, 0-@deleteDays, CURRENT_TIMESTAMP)

DECLARE @strDate NVARCHAR(20)

SET @strDate = CONVERT(VARCHAR(20),CURRENT_TIMESTAMP,120)
SET @strDate = REPLACE(@strDate, N'-', N'')
SET @strDate = REPLACE(@strDate, N':', N'')
SET @strDate = REPLACE(@strdate, N' ', N'')

DECLARE @backupDir NVARCHAR(4000)

-- Read the instance's default backup directory from the registry
EXEC master.dbo.xp_instance_regread
  N'HKEY_LOCAL_MACHINE',
  N'Software\Microsoft\MSSQLServer\MSSQLServer',N'BackupDirectory',
  @backupDir OUTPUT,
  N'no_output'

-- Enumerate all databases in the instance except tempdb
DECLARE cursorDb CURSOR FOR
SELECT name
  FROM master.dbo.sysdatabases
  WHERE name NOT IN (N'tempdb')

DECLARE @name NVARCHAR(128) -- sysname; database names can be up to 128 characters
DECLARE @path NVARCHAR(256)
DECLARE @model NVARCHAR(32)

OPEN cursorDb
FETCH NEXT FROM cursorDb INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
  SELECT @model = recovery_model_desc FROM master.sys.databases WHERE name = @name

  -- master cannot take a differential backup, so fall back to a full backup
  IF @backupType = N'BAK' OR (@backupType = N'DIF' AND @name = N'master')
  BEGIN
    SET @path = @backupDir + N'\' + @name + N'_' + @strDate + N'.BAK'
    BACKUP DATABASE @name TO DISK = @path
  END
  ELSE IF @backupType = N'DIF'
  BEGIN
    SET @path = @backupDir + N'\' + @name + N'_' + @strDate + N'.DIF'
    BACKUP DATABASE @name TO DISK = @path WITH DIFFERENTIAL
  END
  ELSE IF @backupType = N'TRN' AND @model = N'FULL'
  BEGIN
    SET @path = @backupDir + N'\' + @name + N'_' + @strDate + N'.TRN'
    BACKUP LOG @name TO DISK = @path
  END
  FETCH NEXT FROM cursorDb INTO @name
END

CLOSE cursorDb
DEALLOCATE cursorDb
-- Purge backups of this type older than the cutoff date from the backup directory
EXEC master.dbo.xp_delete_file 0, @backupDir, @backupType, @dateDeleteBefore, 0

GO

Call the script using SQLCMD as in the following examples, and schedule as appropriate.
 
SQLCMD.EXE -S .\INSTANCE -i C:\yourpathtoscript\BACKUP.SQL -v BACKUPTYPE = BAK -v DELETEDAYS = 14
 
SQLCMD.EXE -S .\INSTANCE -i C:\yourpathtoscript\BACKUP.SQL -v BACKUPTYPE = DIF -v DELETEDAYS = 7
 
SQLCMD.EXE -S .\INSTANCE -i C:\yourpathtoscript\BACKUP.SQL -v BACKUPTYPE = TRN -v DELETEDAYS = 3

The BACKUPTYPE variable controls the type of SQL Server backup.  All databases (except tempdb) in the instance are included, and attempts to execute transaction log (TRN) backups against databases that are set to SIMPLE or BULK_LOGGED recovery are skipped.  Since the master database cannot take a differential backup, using DIF produces a full backup instead, with the BAK extension.  All backups are stored in the default backup directory for the SQL Server instance, as found in the registry and specified during installation of the instance.  The DELETEDAYS variable indicates how many days of backups of the same type to keep; anything older is purged from the directory.

This script runs great under the SYSTEM context when executed on the machine where the instance is installed, which means you don't have to store passwords.
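
As an illustration, a weekly full backup task might be registered as follows; the task name and schedule here are arbitrary placeholders, and the SQLCMD arguments are the same as in the examples above.

SCHTASKS /Create /TN "SQL Full Backup" /TR "SQLCMD.EXE -S .\INSTANCE -i C:\yourpathtoscript\BACKUP.SQL -v BACKUPTYPE = BAK -v DELETEDAYS = 14" /SC WEEKLY /D SUN /ST 02:00 /RU SYSTEM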

Wednesday, May 26, 2010

Microsoft Dynamics AX 4.0 and Analysis Services 2008

Using Microsoft Dynamics AX 4.0 with SQL Server 2008 or SQL Server 2008 R2 produces a couple of errors that are easily fixed.

First, when trying to Transfer or Process a Cube Instance, the following errors occur.

Method 'setServer' in COM object of class '{A4D49012-F5A5-4d86-8978-863A41277677}' returned error code 0x80070002 () which means: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.

One or more required client components that support OLAP analysis are not installed on this computer. For details on the required components, in Administrator Help, see Set up OLAP reporting.

The ax32.exe binary is hard-coded to reference the Microsoft.AnalysisServices .NET assembly version 9.0.242.0, but this can be fixed with a binding redirect.

Navigate to %programfiles%\Microsoft Dynamics AX\40\client\bin\ and create a file named ax32.exe.config with the following contents.

<?xml version="1.0" encoding="utf-8" ?> 
<configuration>
  <runtime> 
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"> 
      <dependentAssembly> 
        <assemblyIdentity name="Microsoft.AnalysisServices" publicKeyToken="89845dcd8080cc91" culture="neutral"/> 
        <bindingRedirect oldVersion="0.0.0.0-9.9.0.0" newVersion="10.0.0.0"/> 
      </dependentAssembly> 
    </assemblyBinding> 
  </runtime> 
</configuration>

This forces ax32.exe to use the Microsoft.AnalysisServices .NET assembly version 10.0.0.0 instead.

Even with this in place, the following error still occurs.

One or more required client components that support OLAP analysis are not installed on this computer. For details on the required components, in Administrator Help, see Set up OLAP reporting.

In class OLAPServerControlAMO, the method AMOAvailable checks to see if components are present before creating a new OLAPAMOManager class.  The code looks for a registry signature found only with 9.0.242.0 and needs to be updated to look for the newer Microsoft.AnalysisServices .NET assembly signatures.  The following updated method works.

static boolean AMOAvailable()
{
    boolean     valid = false;
    #winapi
    container reg   = connull();
    int       handle;
    str       keyValue;

    // This method checks to see if the registry contains a key from the Server object
    // of the AMO dll. If it does, it checks the key value against the version of the
    // object that is to be used. If it is a match, then AMO is considered available
    // and true is returned.
    handle  = WinAPI::regOpenKey(#HKEY_CLASSES_ROOT, 'Microsoft.AnalysisServices.server\\Clsid', #KEY_READ);
    if (handle)
    {

        reg     = WinAPI::regGetValue(handle, '');
        keyValue = conpeek(reg,1);
        if (keyValue == '{554BBCA3-925F-4797-9460-2421A8CD7030}')
            valid = true;
        WinAPI::regCloseKey(handle);
    }

    handle = WinAPI::regOpenKey(#HKEY_LOCAL_MACHINE, 'SOFTWARE\\Classes\\Installer\\Assemblies\\Global', #KEY_READ);
    if (handle)
    {
        reg = WinAPI::regGetValue(handle, 'Microsoft.AnalysisServices,fileVersion="10.0.2531.0",version="10.0.0.0000",culture="neutral",publicKeyToken="89845DCD8080CC91",processorArchitecture="MSIL"');
        if (conlen(reg) > 0)
            valid = true;

        reg = WinAPI::regGetValue(handle, 'Microsoft.AnalysisServices,fileVersion="10.50.1600.1",version="10.0.0.00000",culture="neutral",publicKeyToken="89845DCD8080CC91",processorArchitecture="MSIL"');
        if (conlen(reg) > 0)
            valid = true;

        WinAPI::regCloseKey(handle);
    }

    return valid;
}

Notice that it checks for both the Analysis Services 2008 (10.0.2531.0) and Analysis Services 2008 R2 (10.50.1600.1) assemblies.

With these two fixes in place and the proper components installed, Transfer and Process should work.