Wednesday, October 15, 2008

Enabling fields and events in bulk edit forms

You may have noticed that in CRM 4.0 the JavaScript events do not fire in Bulk Edit forms. It doesn't seem to be a big issue: until today I hadn't heard any complaints about it (or I had already forgotten them). Anyway, there are two things about the Bulk Edit form that you should know:

Any event handling code (OnLoad, OnSave, OnChange) is disabled by default, but you can get it working again.
Any field that has an OnChange event handler applied is disabled on the form. You can fix that as well.

Setting up a “mirror” of a production environment for dev/testing

Posted by John Straumann

I recently needed to set up a “mirror” of a production system for use in development and testing. I had worked with the Redeployment tool in CRM v3, and I was interested to see how v4 worked. I was very happy to find that the v4 tool was *very* easy to use, and here are the steps I followed.

1. Used SQL 2005 to do a backup of the CRM v4 production databases
2. Using the VPC image distributed by Microsoft, I copied the .bak file to the VPC. I then did a restore of the database files to the VPC SQL instance. For some reason the SQL GUI did not work, so I used the following command (Names changed):
restore database OrgName_MSCRM from disk='C:\Temp\OrgName_MSCRM.bak' WITH MOVE 'OrgName_MSCRM' TO 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\OrgName_MSCRM.mdf', MOVE 'OrgName_MSCRM_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\OrgName_MSCRM_log.LDF'

3. Once the DB was successfully restored on the VPC image, I fired up the CRM Deployment Manager, right-clicked on the “Organizations” item, and selected “Import Organization”
4. The next step asked me to specify the SQL server, and as soon as I did that the deployment manager found the organization I wanted to import
5. The next screen asked me for a name and a display name
6. Next input was the report server URL
7. The next steps involved the mapping of users. There are several options for this step; I chose to simply “Manually Map Users”, and the only user I mapped was the Administrator, which was mapped to the Litwareinc Administrator on the VPC. I got a warning that “All users are not mapped. Do you wish to continue?” and just clicked “Yes”
8. The next screen was simply the tasks verification screen, so I clicked “Next” and then on the next screen “Import”
9. And that was it! It took a bit of time and at the end I am able to access the org via http://moss:5555/orgname/loader.aspx

You can see the screens here but please note names are hidden:

http://www.mscrmguy.com/redeploy/redeploy.html

Data Migration Manager Tips and Tricks

Sonal Sawhney Published Tuesday, September 02, 2008


href="http://blogs.msdn.com/crm/archive/2008/01/04/microsoft-dynamics-crm-4-0-data-migration-manager-with-john-o-donnell.aspx">Data migration Manager (DMM) users who are looking for tips and tricks to improve their experience with DMM.

Boost Data Migration Manager’s (DMM) Performance

1. All the machines in the system must be in the same time zone and use the same regional settings.

2. In Internet Options -> Connections -> LAN Settings, set the physically nearest proxy server and select ‘Bypass proxy server for local addresses’. Do not use the ‘Automatic Configuration’ option, as it involves IL code generation (observed for .NET 2.0 SOAP classes).

3. Make sure the Dynamics CRM 4.0 application has been launched at least once before testing.

4. All data files should be less than 32 MB in size. If you have a source data file that is over 32 MB, split the data into several files (making sure to include the header row in each file) and migrate them separately; see the sketch after this list.

5. While running DMM, do not load the client with other applications.

6. Preferably do a test run with a smaller number of records before proceeding with the actual data migration, so that you can undo the migration using DMM if any issue arises.

7. DMM will execute more slowly in a virtual machine (Microsoft Virtual Server or VMware) than on real hardware with a similar configuration.

8. Instead of using SQL Express, if you have a license for SQL Server, it is better to run the DMM client on the same machine where SQL Server is installed.

9. For an on-premise install, running DMM and the CRM server on different machines enhances performance. If they are on different machines, make sure the DMM machine and the CRM server are on a fast LAN connection with no TCP routing hops or HTTP proxies in between.

10. The version of the Data Migration Manager must match the version of Microsoft Dynamics CRM to which you are connecting.

11. DMM cannot be run at the same time as Microsoft Dynamics CRM for Outlook. To disable CRM in Outlook:

In the notification area, click the Microsoft Dynamics CRM Application Host icon, and then click Disable CRM.  The icon appears grey.

Microsoft Dynamics CRM will be re-enabled the next time you start Outlook.

12. Turn off tracing/logging for better performance.

13. Upload files in order of lookup resolution. If Account contains lookups to Contact, then Contact should be selected first and then Account; this saves time in determining the order in which files are processed (sorting an already sorted array takes less time).
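As a concrete illustration of tip 4, here is a minimal C# sketch of splitting an oversized .csv file into parts while repeating the header row in each part. The file name, the rows-per-part figure, and the assumption that no field contains embedded line breaks are my own choices for the example, not anything prescribed by DMM.

using System.IO;

class CsvSplitter
{
    static void Main(string[] args)
    {
        // Example: split accounts.csv (hypothetical file name) into parts of 50,000 data rows each.
        Split(args.Length > 0 ? args[0] : "accounts.csv", 50000);
    }

    static void Split(string sourcePath, int rowsPerFile)
    {
        using (StreamReader reader = new StreamReader(sourcePath))
        {
            // The first line is the header row; it must be repeated in every part.
            string header = reader.ReadLine();
            StreamWriter writer = null;
            int part = 0;
            int rowsInPart = 0;
            string line;

            while ((line = reader.ReadLine()) != null)
            {
                if (writer == null || rowsInPart == rowsPerFile)
                {
                    if (writer != null) writer.Close();
                    part++;
                    writer = new StreamWriter(sourcePath + ".part" + part + ".csv");
                    writer.WriteLine(header); // every part gets the header row
                    rowsInPart = 0;
                }
                writer.WriteLine(line);
                rowsInPart++;
            }
            if (writer != null) writer.Close();
        }
    }
}

Pick a row count per part that keeps each file comfortably under the 32 MB limit for your data.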

Customize DMM

Can I customize Microsoft Dynamics CRM while DMM is running?

If you run a migration after making customizations in Microsoft Dynamics CRM, you might receive data upload errors. To prevent this, restart Data Migration Manager after any change to CRM metadata so that it synchronizes the metadata customizations.

In one particular case, if migrated data is for a custom entity and that custom entity is later deleted using the web application, a migration purge using the DMM tool will fail.

To succeed, recreate the deleted custom entity and retry the deletion using the DMM tool.

How to create attributes of Date-Time type through DMM?

DMM supports creating attributes of the Date type only. To migrate data into a date-time attribute, create the date-time attribute on the server before beginning the migration, and map the source attribute to the newly created attribute.

How to migrate Lookup attributes that Reference a Custom Entity Created by the DMM?

The Data Migration Manager does not support migrating records from other record types that refer to a custom entity that is created by the same migration. If you have other files in this migration with records that refer to this entity, you need to cancel this migration, and instead, do two migrations.

Step 1: Migrate this custom entity in the first migration, and the wizard will create it in Microsoft Dynamics CRM.

Step 2: Migrate data having lookup references that refer to this new entity.

When I use non-English characters in a name for a custom attribute, why do I get an "Invalid characters in name" error message?

DMM uses one input for both the database logical name and the display name, and the database logical name must use ASCII characters.

To work around this problem, after you migrate your data, use the Customization area of Microsoft Dynamics CRM to change the display name.

When I migrate data to a custom attribute, if DMM fails to create the attribute the first time, and I reuse the data map from this migration, why does my custom attribute data not get migrated?

If a custom attribute is not created the first time you run an Express mode migration, all data in this attribute will be ignored when you reuse the data map from this migration for another Express mode migration.

To work around this problem, when you start DMM using the data map, use Standard mode, so that you can specify creating a new attribute again.

Migrate Specific Attributes

Are you migrating date/time source data? Ensure the following:

1. If the date/time format of your source data differs from the format set in your CRM settings, consider changing the CRM format settings accordingly before starting the migration. It is advisable to take a small number of test records and verify your expectations before starting the migration.

2. Microsoft Dynamics CRM 4.0 Data Migration Manager validates the date data in your source data during migration. In cases where the date data is not in the format specified by your date format setting, DMM displays an error.

3. DMM uses the date and time settings of the user in whose context it is running. To change the user's time zone in CRM, go to Tools > Options and change your time zone. To change the date format settings, go to Tools > Options > Formats > Customize > Date, change the settings, and save. These settings will be used by DMM to migrate the dates and times.

How can I preserve the auditing information like created on while migrating data from my CRM system to Microsoft CRM?

In order to migrate data into the “Created On” attribute, you must map the source column that contains this data to the Microsoft Dynamics CRM Record Created On (overriddencreatedon) attribute. As the record is migrated, the Created On date will be updated with this value, and the Record Created On value will be set to the date and time the record was actually migrated.

However, for a custom entity created through DMM, data in the Created On column of the corresponding .csv file cannot be migrated. To work around this limitation, create the custom entity using the Customization area of Microsoft Dynamics CRM, and then migrate data to the custom entity.

How can I migrate data into the Time Zone field?

To migrate time zone fields, you need to specify the correct ‘code’ for the time zone in the .csv file. The list of time zones with their respective integer values can be found in the DMM Help file.

How can I map state/status or any Boolean type field?

State/Status/Boolean fields are treated just like picklist fields in CRM and have one or more options associated with them. To migrate data into any of these types of fields, the user needs to define a picklist mapping along with the column mapping.

E.g. let’s say that the user wants to migrate data to the “DoNotEmail” field of the Account entity, and the source system has a corresponding field named “EmailAllowed” which takes “true”, “false”, and “not specified” as its possible values. The user will have to create picklist mappings that map the source system’s “true”, “false”, and “not specified” values to the target system’s Boolean options 0, 1 and 0 respectively. The same is required when a user is migrating data to state/status fields.

<AttributeMap Id="">

    <SourceAttributeName>EmailAllowed</SourceAttributeName>

    <TargetAttributeName>donotemail</TargetAttributeName>

    <ProcessCode>Process</ProcessCode>

    <PicklistMaps>

        <PicklistMap Id="">

            <SourceValue>true</SourceValue>

            <TargetValue>0</TargetValue>

            <ProcessCode>Process</ProcessCode>

        </PicklistMap>

        <PicklistMap Id="">

            <SourceValue>false</SourceValue>

            <TargetValue>1</TargetValue>

            <ProcessCode>Process</ProcessCode>

        </PicklistMap>

        <PicklistMap Id="">

            <SourceValue>not specified</SourceValue>

            <TargetValue>0</TargetValue>

            <ProcessCode>Process</ProcessCode>

        </PicklistMap>

    </PicklistMaps>

</AttributeMap>

How to overwrite pick-list mappings (different values) done by “Auto pick-list mapping”?

The DMM gives the user an option to customize the CRM system automatically with picklist options that are unmapped. This option will take all the unmapped source picklist values and create picklist options for them in CRM. The point to note here is that a picklist option is added to the CRM system only if the picklist value is unmapped. If the user does not want the CRM system to automatically customize the picklist options for an attribute, then she can create picklist mappings for that attribute, mapping all the source picklist options to existing CRM picklist options.

Some records are migrated to Microsoft Dynamics CRM with a default status.

This is documented in the DMM ReadMe:

http://download.microsoft.com/download/e/f/b/efb71b2d-5b70-470a-94a1-0260e460e437/Microsoft_Dynamics_CRM_4.0_Data_Migration_Manager_Readme.htm

How can I migrate data to the QuantitySellingOption drop-down list?

Quantity Selling Option is not customizable in Microsoft Dynamics CRM, so if your .csv file has customized values in this column, they will not be migrated. To work around this problem, you will need to do one of the following:

  • Manually edit the data map for the migration to map the customized drop-down list (picklist) values in your price list items .csv file to valid Microsoft Dynamics CRM values. The three valid values are No Control, Whole, and Whole and Fractional.
  • Edit your price list items .csv file and change the custom values to valid Microsoft Dynamics CRM values.

Migrate Users

Why doesn't data from all the attributes in my users .csv file get migrated?

DMM will create new users based on data in a .csv file, but it only migrates the following fields to which you have mapped your source csv columns:

  • Domain logon name
  • First Name
  • Last Name
  • E-mail

You will need to manually enter any other user data.

How to map users and create new users?

Find the details at: http://blogs.msdn.com/crm/archive/2008/02/20/record-owner-information-migration-using-dmm.aspx

HANDLING ERRORS

Why might some records be migrated despite errors?

If attribute customization fails in DMM for an entity, the records for that entity are still migrated, except for the column for which the custom attribute was being created.

In this case every row is reported as an error because the custom attribute creation failed, but in reality all the records are migrated; only the custom attribute data is missing.

To fix this, either:

1. On the CRM server side, use the web application to create the custom attribute and manually update all the migrated records with the required value for the newly added attribute.

OR

2. Resolve all the errors reported on the "Conversion error" screen before starting the data upload using DMM.

Encountered “Generic SQL error” during data conversion?

The most probable cause of this error is the per-attribute field size limit. If your attribute data exceeds the limit, opt for field width customization by making sure that the checkbox on the Overview screen of DMM which reads “To accommodate the source data, automatically customize Microsoft Dynamics CRM list values and length of attributes” is checked. However, attributes of type "ntext" do not support automatic field width customization for lengths greater than 2000. In case of failures, increase the field size using CRM application customization, export the failed records from DMM, and retry migrating them.

Parsing error: "Row is too long to import"?

Currently there is a limit of 4000 characters on the columns of the data rows in the files used for migration. You will hit the above error if any column exceeds this limit. Here is a workaround:

1. On the machine where you have DMM installed, add a DWORD registry value called ‘ImportParsedColumnDefaultSize’ with a very large value (for example, 10000 decimal) under: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Data Migration Wizard

2. On the SQL Server instance you are using for DMM (or SQL Express), create a database trigger by running the following script:

USE MSCRM_MIGRATION
GO
-- DDL trigger: fires on every CREATE TABLE in the MSCRM_MIGRATION database
-- and widens the parse columns COL0..COL23 to nvarchar(max).
CREATE TRIGGER DDLConvertToMax ON DATABASE FOR CREATE_TABLE
AS
    DECLARE @raisedEventData XML
    DECLARE @substr XML
    DECLARE @tablename nvarchar(500)
    DECLARE @columnName nvarchar(100)
    DECLARE @sqlstatement nvarchar(1000)

    -- Get the name of the table that was just created
    SET @raisedEventData = eventdata()
    SET @substr = @raisedEventData.query('data(/EVENT_INSTANCE/ObjectName)')
    SET @tablename = CONVERT(nvarchar(200), @substr)

    DECLARE @i int
    SET @i = 0
    WHILE @i < 24
    BEGIN
        SET @columnName = 'COL' + CONVERT(nvarchar(50), @i)
        -- Only alter the column if it exists on the new table
        IF columnproperty(object_id(@tablename), @columnName, 'ColumnId') IS NOT NULL
        BEGIN
            SET @sqlstatement = N'ALTER TABLE ' + @tablename + N' ALTER COLUMN ' + @columnName + N' nvarchar(max)'
            PRINT @sqlstatement
            EXECUTE sp_executesql @sqlstatement
        END
        SET @i = @i + 1
    END
GO

Here is what the script does:

1. It operates on the database and is independent of the CRM code
2. It automatically converts columns COL0 through COL23 to nvarchar(max)
3. It checks for the existence of each column before resizing it
4. It operates on the MSCRM_MIGRATION database
5. Consider running this only for the parse tables that are failing, and then delete the trigger

Why do I get the error "An error has occurred. Please see the log file for more information." when I try to import a data map?

If you have a value that is too long in a parameter such as SourceName, Name, or SourceEntityName, you will receive this error. The log file will show a "Generic SQL error." To work around this problem, shorten parameter names in the data map to 160 or fewer characters.

Why do I get error messages about rows with a number greater than the number of rows in my .csv file?

Microsoft Office Excel doesn't display rows with blank data, so if you look at your .csv files with Excel, you will not see the empty rows at the end.

To remove empty rows at the end of a .csv file, open it in Notepad and delete them.

Why do I see Data Upload Errors with no description?

If an article record has a status of Published, it cannot be migrated. Articles must have a Status value of Draft or Unapproved in order to be migrated. After migrating the articles, use Microsoft Dynamics CRM to change their status.

For the following issues and many others, refer to the Troubleshooting Guide:

  • When I start Data Migration Manager, I see the message "Mandatory updates have not been installed. Data Migration Manager cannot continue."
  • Can't migrate data for entities (like business unit) which have a mandatory self-reference
  • How do I enable creating logs that include more details about migration errors?

References:

ReadMe:

http://download.microsoft.com/download/e/f/b/efb71b2d-5b70-470a-94a1-0260e460e437/Microsoft_Dynamics_CRM_4.0_Data_Migration_Manager_Readme.htm

Troubleshooting Guide: http://rc.crm.dynamics.com/rc/regcont/en_us/OP/articles/migrationtroubleshooting.aspx

Cheers,

Sonal Sawhney

CRM Accelerators – Part V – Sales Methodologies Accelerator

Sales Methodologies Accelerator – Due to be released in Q3 2008


The sales methodologies accelerator is for customers who work with, or want to work with, one of the leading sales methodology vendors, including Target Account Selling (TAS), SPI Solution Selling and Miller Heiman. Each of these vendors has its own distinct software products, which need to be integrated with Microsoft Dynamics CRM to provide a holistic approach to sales management.

The sales methodologies accelerator will provide customizations, advice and guidance for customers on how these sales methodologies can be configured, managed and integrated with Microsoft Dynamics CRM 4.0.

Kind regards,

Reuben Krippner

Business Data Catalogue (BDC) and Microsoft Dynamics CRM

Manbhawan Prasad Published Tuesday, October 07, 2008 9:10

Business Data Catalogue (BDC) is one of the most exciting features of SharePoint. It was introduced in SharePoint 2007 and gives you a declarative way to connect SharePoint to business applications such as standard ERPs, CRM, home-grown apps, etc.

It supports direct DB access as well as access through web services. You can find good details on the MSDN site (http://msdn.microsoft.com/en-us/library/ms551230.aspx). Ramping up on BDC might take a little work initially. One good way to get started is playing with the samples provided at http://www.microsoft.com/downloads/details.aspx?familyid=6d94e307-67d9-41ac-b2d6-0074d6286fa9&displaylang=en.

If you decide to use BDC for your organization, you can follow the guidelines below.

1. Prepare the BDC metadata (XML), which contains the following information about the back-end system in a defined format:

a. Connectivity details

b. Entity details

c. Details around different entity methods/APIs with arguments & return types

2. You can use the tool developed by the SharePoint team to write these definitions. More details are at http://blogs.msdn.com/sharepoint/archive/2007/08/22/announcing-the-microsoft-business-data-catalog-definition-editor-for-microsoft-office-sharepoint-server-2007.aspx.

3. Import the BDC metadata file into SharePoint

a. You can find the place to do this at Shared Services in SharePoint central administration

b. It might take a few cycles to get all the errors corrected here

With that, you are done letting SharePoint know the details it will use to connect to your LOB (line of business)/back-end application. Many capabilities can be surfaced using BDC. You can get more details at http://blogs.msdn.com/sharepoint/archive/2006/04/18/business-data-catalog.aspx.

IMHO, one of the biggest advantages BDC provides is in surfacing a common platform for users who deal with different applications and data sources. For example, let's assume that an organization uses MSCRM for one set of data and another app for some other set of data. Many times the data stored in these two different apps ends up having relationships. In this case, it becomes very painful for users to switch applications to get their daily work done. BDC comes to the rescue here and provides a common platform with some nice out-of-box web parts, with which one can perform activities right in SharePoint without changing context and applications.

In addition, BDC provides a strong programming environment through its object model, which makes it easy to build applications that talk to different LOB apps. Developers do not need to learn the APIs and schemas of each back-end application; they just need to know the BDC object model, and they can develop a common platform that talks to disparate systems.

Used prudently, it can seem like magic to end users, who can navigate to different types of records from different back-end systems (related and unrelated) as smoothly as if the data were coming from one source system. Enterprise search using BDC becomes even stronger here, as searching across data coming from disparate systems becomes very cheap to own.

For the reasons above, the importance of building BDC metadata for MSCRM cannot be overstated, and there have been various efforts toward this. Some important ones are as follows:

MSCRM v3: http://blogs.microsoft.co.il/files/folders/9188/download.aspx. This is a link to a BDC metadata file which can be a good starting point for building your own BDC metadata for MSCRM. Please note that this might not work ‘as is’ for your implementation of MSCRM.

MSCRM v4.0: http://blogs.msdn.com/crm/archive/2008/08/26/crm-accelerators-part-iv-enterprise-search-accelerator.aspx

While implementing BDC for the organization, one more thing to keep in mind is to define a good process for updating the BDC metadata definitions if the source system goes through customizations such as new attribute or relationship creations.

Some good resources around BDC can be found at:

  1. http://channel9.msdn.com/shows/In+the+Office/Introducing-the-Business-Data-Catalog-BDC/
  2. http://office.microsoft.com/en-us/sharepointserver/HA102200501033.aspx
  3. http://blogs.msdn.com/martinkearn/archive/2006/10/05/What-is-the-Business-Data-Catalouge_3F00_.aspx

Cheers,

Manbhawan Prasad

Microsoft Dynamics CRM 4.0 Step by Step


Microsoft Dynamics CRM 4.0 Step by Step allows you to work at your own pace through the easy numbered steps, practice files on CD, helpful hints, and troubleshooting tips to master the fundamentals of working with the latest versions of Microsoft Dynamics CRM including CRM 4.0 and CRM Online. You will learn the specifics of tracking customer communications, how to use e-mail templates for mass communication, the ins and outs of reporting and data analysis, and other essential tasks. With Step by Step, you can take just the lessons you need or work from cover to cover. Either way, you drive the instruction, building and practicing the skills you need, just when you need them! Includes a companion CD with hands-on practice files.

This book:

  • Lets you take just the lessons you need or work from cover to cover; you set the pace
  • Covers all the fundamentals, from tracking customer communication to reporting and data analysis
  • Features easy-to-follow lessons and hands-on skill-building exercises
  • Includes a companion CD with skill-building exercises

Mike Snyder is recognized as an industry-leading CRM expert. He is cofounder and principal of Sonoma Partners, a Chicago-based consulting firm that specializes in Microsoft CRM implementations. Microsoft twice awarded Sonoma Partners the Global Microsoft CRM Partner of the Year. Jim Steger has architected and led multiple award-winning Microsoft CRM deployments, including complex enterprise integration projects. He has been developing solutions and writing code for Microsoft CRM since the version 1.0 beta. He acted as a technical reviewer for several of the Microsoft CRM certification exams. Jim is also a cofounder and principal of Sonoma Partners.

Mike Snyder

CRM Accelerators – Part VIII – Business Productivity Accelerator

Business Productivity Accelerator
The business productivity accelerator is a toolkit of timesaving customizations and workflows for Dynamics CRM 4.0. Here is a subset of the components to be delivered:

  • Business data auditing workflows
  • Customizations to manage a customer reference program with Microsoft Dynamics CRM 4.0
  • Generic service process workflows for complaints management
  • Simple integration with web software services such as stock prices, mapping and web search
  • Birthday/Anniversary auto-email workflows
There are many other ideas under development for this accelerator and the final list will be communicated at release.

Kind regards,

Reuben Krippner


CRM Accelerators – Part VII – CRM Notifications Accelerator

CRM Notifications Accelerator

The CRM notifications accelerator allows users to subscribe to the CRM “business events” that are significant to them; e.g. a salesperson is interested in new leads and opportunities assigned to them, whereas a customer service representative is interested in new service cases assigned to them. Once the user has subscribed to the types of events that are important to them (each user manages their subscription profile), they can choose how they want these event notifications to be delivered. Notifications are delivered via a Really Simple Syndication (RSS) feed and can be consumed with many desktop tools, including Microsoft Outlook 2007 or the standard news feed Windows Vista® gadget.

The CRM notifications accelerator further drives system and process efficiency for users by giving them visibility into the business events which are directly relevant to their role.

Kind regards,

Reuben Krippner

Internet Facing Deployment (IFD) Installation Basics

Shashi Ranjan Published Friday, September 19, 2008 10:14

Hi, I recently started owning the Internet Facing Deployment (IFD) feature set within the CRM team, so in this blog I will share some of my thoughts and insights. I have also noted that there are no IFD-related blog posts on CRM so far, so this should be a good start.

IFD allows customers to configure their CRM system to be reachable from outside the intranet (i.e. from the internet, or outside the firewall). The main difference when using IFD vs. a typical on-premise deployment is how users are authenticated. When using the on-premise version, IIS handles most of the authentication via Integrated Windows Authentication. There are, however, custom CRM authentication handler modules registered during setup to assist in the process. In IFD, the web site is opened for anonymous access and authentication relies on the presence of the CRM ticket cookie. This cookie is obtained by starting off from a sign-in page.

Any page request that does not contain the CRM ticket cookie gets redirected to the sign-in page. The sign-in page deletes any expired CRM ticket cookies, and if the user provides correct credentials, a new CRM ticket cookie is written and used in later requests.

To install an IFD-enabled CRM system, the following node needs to be added to crminstall.xml under the path <CRMSetup><Server>. You will see that there isn't much information listed below, so the difference between an on-premise deployment with or without IFD should not be much. I will explain the differences later.

<ifdsettings enabled="true">

<internalnetworkaddress>157.55.160.202-255.255.255.255</internalnetworkaddress>

<rootdomainscheme>https</rootdomainscheme>

<sdkrootdomain>myDomain</sdkrootdomain>

<webapplicationrootdomain>myDomain</webapplicationrootdomain>

</ifdsettings>

Node details:

  1. The <ifdsettings> node is the root node containing all the details that the CRM server needs to enable IFD.
  2. The “enabled” attribute indicates if the IFD is to be enabled or not.
  3. The <internalnetworkaddress> is used to distinguish internal from external request IP addresses. Internal requests will continue to go through the usual on-premise authentication, whereas external requests will go through the IFD authentication. This is also known as SPLA authentication; SPLA stands for Service Provider Licensing Agreement. The value is specified in the IPAddress-IPAddressMask format, and multiple values can be specified by separating them with a semicolon (see the sketch after this list).
  4. The <sdkrootdomain> and <webapplicationrootdomain> nodes are used to define the SDK and web application root domains. These should ideally be fully qualified domain names, and they can be the same if the SDK and application servers are located on the same box.
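To make the IPAddress-IPAddressMask format concrete, here is a rough, hypothetical C# sketch of how an address-plus-mask comparison of this general kind works; it is an illustration of the format only, not the CRM server's actual logic. With the sample mask of 255.255.255.255, only an exact match on the configured address counts as internal.

using System;
using System.Net;

class InternalAddressCheck
{
    static bool IsInternal(IPAddress request, IPAddress internalAddress, IPAddress mask)
    {
        byte[] req = request.GetAddressBytes();
        byte[] cfg = internalAddress.GetAddressBytes();
        byte[] msk = mask.GetAddressBytes();

        // Different address families (e.g. IPv6 vs IPv4) cannot match.
        if (req.Length != cfg.Length || req.Length != msk.Length)
            return false;

        for (int i = 0; i < req.Length; i++)
        {
            // The request is "internal" if it matches the configured address
            // on every bit selected by the mask.
            if ((req[i] & msk[i]) != (cfg[i] & msk[i]))
                return false;
        }
        return true;
    }

    static void Main()
    {
        bool isInternal = IsInternal(
            IPAddress.Parse("157.55.160.202"),
            IPAddress.Parse("157.55.160.202"),
            IPAddress.Parse("255.255.255.255"));
        Console.WriteLine(isInternal); // True: this mask requires an exact match
    }
}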

The installation of an IFD-enabled CRM server is pretty simple, but CRM Setup does not provide any UI support for it, so IFD needs to be enabled via an XML-driven install. Also, once CRM is installed, there is no easy out-of-box way (tool) to enable it. There is, however, a support tool that customers can download and use.

The support tool can be downloaded from:

http://www.microsoft.com/downloads/details.aspx?FamilyID=69089514-6e5a-47e1-928b-4e4d4a8541c0&DisplayLang=en

The documentation to use the tool and more on Microsoft Dynamics CRM 4.0 Internet Facing Deployment Scenarios can be found at:

http://www.microsoft.com/downloads/details.aspx?FamilyID=3861e56d-b5ed-4f7f-b2fd-5a53bc71dafc&DisplayLang=en

OK, now let's talk about the differences between an on-premise deployment and an IFD-enabled on-premise deployment. On an IFD-enabled deployment you will see the following:

1) A new registry value ‘IfdInternalNetworkAddress’, a string in the format IPAddress-IPAddressMask. This contains the value of the <internalnetworkaddress> node.

2) Three new DeploymentProperties in the config database (MSCRM_CONFIG).

     a. 'IfdRootDomainScheme',

     b. 'IfdSdkRootDomain',

     c. 'IfdWebApplicationRootDomain',

select * from DeploymentProperties where ColumnName like ‘Ifd%’

Each entry has the string (NVarCharColumn) value set to the values from the config xml. The Id field in the deployment property is the deployment id.

select id from Deployment

3) Updated <crm.authentication> node in the web.config file

<authentication strategy="OnPremise " />

-->

<authentication strategy="ServiceProviderLicenseAgreement" />

4) Anonymous access being enabled on the CRM website and all web pages under it.

5) The URL used to access IFD is different from on-premise. It is in the form https://myOrganizationName.myDomain. Jagan Peri’s blog post covers this in more detail: http://blogs.msdn.com/crm/archive/2008/08/01/microsoft-dynamics-crm-urls.aspx

You should also check out the latest version of the SDK (version 4.0.6). It has some great examples of adding web pages to an IFD-enabled CRM server, and more.

The CRM SDK 4.0.6 can be downloaded from: http://www.microsoft.com/downloads/details.aspx?FamilyID=82E632A7-FAF9-41E0-8EC1-A2662AAE9DFB&displaylang=en

Cheers,

CRM Accelerators – Part VI – Extended Sales Forecasting Accelerator

The extended sales forecasting accelerator enriches the out-of-the-box sales forecasting capabilities in Microsoft Dynamics CRM 4.0. Principally, sales users of Microsoft Dynamics CRM can review their individual sales pipeline and quickly classify opportunities as committed, excluded or upside. Sales managers can monitor and track sales targets, budgets and performance against these forecasts for specific time periods (e.g. months and quarters).

Underpinning these capabilities will be new Microsoft Dynamics CRM 4.0 reports that summarize sales performance for the organization, business unit, team or individual salespeople.

Kind regards,

Reuben Krippner

Report Wizard: Query execution failed for data set 'DSMain'

Posted by David Jennaway at Wednesday, 15 October 2008

There is a problem with the CRM 4.0 Report Wizard that can result in an error like the following:
An error has occurred during report processing. Query execution failed for data set 'DSMain'. The column 'accountid' was specified multiple times for 'account0'. The column 'accountid' was specified multiple times for 'opportunity1'.

Explanation of the problem
The ultimate cause is how the Report Wizard stores the Filtering Criteria for reports based on the account entity. The Report Wizard stores the query for any criteria as a combination of all fields in the account entity and all fields in the related primary contact. When the report is run, the SQL query attempts to use the results of the following (or similar) query as an aliased derived table (account0):

select DISTINCT account0.*, accountprimarycontactidcontactcontactid.* from FilteredAccount as account0 left outer join FilteredContact as accountprimarycontactidcontactcontactid on (account0.primarycontactid = accountprimarycontactidcontactcontactid.contactid) where (account0.statecode = 0)

This returns two fields called accountid (one from the account entity, and one from the contact), which breaks the main SQL query for the report, and gives the error above.

Resolution
The way to resolve this is to ensure that, when you create the report with the Report Wizard, you do not specify any criteria for the account entity. This will cause the Report Wizard to store the query as solely against the account entity. Once you’ve created the report, you can happily edit the default filter to whatever you want, and the report will work fine – the key factor is not having any criteria when you first create the report.

Unfortunately there’s not an easy way to fix existing reports with this problem – it should be possible to edit the data in the DefaultFilter column in the reportbase table, but this is unsupported. I’d suggest in this scenario that you’re best off recreating the report from scratch.

SQL Server: The instance name must be the same as computer name

Posted by David Jennaway on Saturday, 11 October 2008

This is something I’ve posted about on newsgroups, but one of my colleagues encountered it recently, and I think it deserves a blog entry.


The CRM Environment Diagnostics Wizard may throw the error ‘The instance name must be the same as computer name’. The most common cause of this is that the computer was renamed after SQL Server was installed. The reason is that, at installation time, SQL Server stores the computer name in a system table, sysservers. This information is not updated when the computer is renamed, and the error from the CRM Environment Diagnostics Wizard indicates that the entry in sysservers does not match the current computer name.


You can diagnose and resolve this by using some SQL system stored procedures. One of them lists the data in sysservers, the other 2 allow you to modify the data to reflect the current machine name.


To check if this is the issue, use SQL Management Studio (or Query Analyzer for SQL 2000) to execute the following query:
sp_helpserver
This will return output like the following:
name: ORGNAME
network_name: ORIGNAME
status: rpc, rpc out, use remote collation
id: 0
collation_name: null
connect_timeout: 0
query_timeout: 0


If the value in the name column does not match the current computer name, then you have to use the following SQL stored procedures to fix the problem. Note that sp_helpserver normally returns one record, but can return more records if you have configured linked servers. If this is the case, it is the row with id=0 that matters.


To change the information you have to first remove the incorrect record, then add the correct one, with the following queries:
sp_dropserver 'ORIGNAME' -- where ORIGNAME is the name returned by sp_helpserver
sp_addserver 'CURRENTNAME', 'LOCAL' -- where CURRENTNAME is the current computer name



If you use named instances, refer to them in the form SERVERNAME\INSTANCENAME. It may then be necessary to restart SQL Server after these changes, but I'm not sure about this. It can't hurt, though, if you can.



There is a KB article about this here. It describes a similar solution, but be warned of a couple of minor issues with it: it fails to specify that quotes are required around the parameters to sp_dropserver and sp_addserver, and I have a feeling (though I can't provide concrete evidence) that running sp_helpserver is more reliable than select @@servername.

Microsoft CRM 4.0.7 SDK Update Released

Another great update to the Microsoft CRM 4.0 SDK. This update was actually released on October 1st, 2008:

http://www.microsoft.com/downloads/details.aspx?FamilyID=82E632A7-FAF9-41E0-8EC1-A2662AAE9DFB&displaylang=en

Tuesday, October 7, 2008

Microsoft BizTalk Server 2006 Adapter for Microsoft Dynamics CRM 4.0


Microsoft® BizTalk® Server 2006 Adapter for Microsoft Dynamics® CRM 4.0 enables integration between Microsoft Dynamics CRM and non-Microsoft business applications. This lets you do the following operations:



  • Use Microsoft Dynamics CRM as a send adapter.
  • Discover and use the schema of any Microsoft Dynamics CRM actions and entities.

With the adapter you can integrate Microsoft Dynamics CRM with any other non-Microsoft business applications using the BizTalk Server 2006 mapping capabilities.

SQL 2008 and MSCRM 4.0

Friday, October 03, 2008 David Fronk Dynamic Methods Inc.


We successfully completed our internal upgrade to SQL 2008 and it is awesome. We did a complete revamp of our system: we moved our CRM server and SQL server to a virtualized 64-bit server and then upgraded SQL to 2008. The speed is fantastic, the move and upgrade were easy, and we're already reaping the benefits. The enhancements to reports alone are worth it from a CRM perspective.

Check out some of the screenshots; if you are familiar with the slick-looking reports that Excel 2007 has, then you will recognize what SQL 2008 now offers. There are gauges that Excel doesn't have (to my knowledge) that SQL 2008 has, so that's one welcome addition.

Also, you can check out the Customer Effective Blog about this as Joel Lindstrom posted on this before SQL 2008 came out. My guess is that Joel has upgraded his system as well but you never know. We did and we're fans.

Here are some screenshots:





David Fronk
Dynamic Methods Inc.

SQL Timeouts in CRM - Generic SQL Error

David Jennaway - Microsoft Dynamics CRM


I often find myself answering forum posts about SQL errors (the most common error is 'Generic SQL Error'). By far the most likely cause of this error is a timeout when accessing SQL Server. If this is the case, your options are to increase the timeout, or to try to ensure the query does not take so long.

Increase the Timeout
The SQL timeout is controlled by a registry value on each CRM server:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM\OLEDBTimeout

This value does not exist by default, and if it does not exist then the default value will be 30 seconds. To change it, add the registry value (of type DWORD), and put in the value you want (measured in seconds). Then you have to recycle the CrmAppPool application pool for the change to take effect (a step most Microsoft documentation omits to mention); do this with IISReset, or less drastically via iisapp.vbs.
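If you prefer to script the change rather than edit the registry by hand, a minimal sketch looks like this (the 300-second value is only an example; pick whatever timeout suits your environment):

using Microsoft.Win32;

class SetOledbTimeout
{
    static void Main()
    {
        // Creates or updates HKLM\SOFTWARE\Microsoft\MSCRM\OLEDBTimeout as a DWORD,
        // measured in seconds. Run this elevated on each CRM server.
        // (On a 64-bit server, run this as a 64-bit process so the value is not
        // redirected to Wow6432Node.)
        Registry.SetValue(
            @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM",
            "OLEDBTimeout",
            300,
            RegistryValueKind.DWord);

        // Remember to recycle the CrmAppPool afterwards (e.g. iisreset) so the
        // new value is picked up.
    }
}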

Reduce the time taken by the query
This may not be so simple, as you may have little control over the query. If the query is run as a result of your code (e.g. through a RetrieveMultiple request), then you may be able to make useful changes. For example, RetrieveMultiple requests on activities are not necessarily processed very efficiently by CRM (the problem is the way that it accesses the activity parties), and I've been able to make significant improvements by using a FetchXml query instead, which gives closer control over the joins used.
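As a minimal sketch of the FetchXml approach, assuming a CrmService proxy from the Microsoft.Crm.SdkTypeProxy assembly (or an equivalent web reference), and with the phonecall query used purely as an illustration:

using Microsoft.Crm.SdkTypeProxy;

class FetchExample
{
    static string RetrieveOpenPhoneCallSubjects(CrmService service)
    {
        // FetchXml gives direct control over the columns and joins used.
        string fetchXml = @"
            <fetch mapping='logical'>
              <entity name='phonecall'>
                <attribute name='subject' />
                <filter>
                  <condition attribute='statecode' operator='eq' value='0' />
                </filter>
              </entity>
            </fetch>";

        // The result set comes back as an XML string.
        return service.Fetch(fetchXml);
    }
}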

Otherwise, the other query optimisation option is to add indexes in SQL Server. This is a massive topic in its own right (I used to deliver 5 day training courses just on this topic), so I'm not going to go into detail here. The general steps are:
  1. Identify the SQL query that takes the time - I use CRM tracing for this - http://support.microsoft.com/kb/907490
  2. Use the SQL Management Studio and SQL Profiler to identify the query execution plan and to get recommendations about possible indexes

There are 2 important things to take into account:

  1. It is unsupported to add SQL indexes to the MSCRM database. My view on this is that, as long as the index does not implement any constraints (i.e. it's not a UNIQUE index), then you will not affect the stability of CRM; you may however need to drop the index prior to upgrading CRM.
  2. Although adding an index may improve the performance of one query, it can adversely affect other SQL operations - most obviously data updates. There is no easy solution to this, though the SQL Profiler can help you if you capture and analyse a representative sample of SQL operations

CRM SDK Update - 4.0.6

There's been a new release of the CRM 4.0 SDK - available here http://www.microsoft.com/downloads/details.aspx?FamilyId=82E632A7-FAF9-41E0-8EC1-A2662AAE9DFB&displaylang=en

There are no major changes in this release, but there are a few aspects of note:
  1. The Plug-in example code now uses the DynamicEntity class as recommended
  2. The code for use of ImportXml, ExportXml and PublishXml looks like it now passes all required XML nodes
  3. There are now instructions for setting up a web reference in Visual Studio 2008
  4. The PrependOrgName client-side function is now documented (and hence supported)

Plug-ins - differences between Target and Image Entity

Posted by David Jennaway


In a plug-in there are potentially several ways to access entity data relevant to the plug-in action. For example, on the create message you can access the data on the new entity instance in one of the following ways:
  1. Via the Target InputParameter
  2. Via an Image Entity registered on the step
  3. Via a Retrieve request in the plug-in code

These do not always work in the same way, as follows:

Availability of the data by stage

The general rules are:

  1. InputParameter is available in all stages. It can be modified in the pre-stage, but changing it in the post-stage will have no effect
  2. A PostImage Entity is available in the post-stage, and a PreImage Entity in the pre-stage only
  3. If using a Retrieve in the plug-in, then the data returned depends on the stage. In the pre-stage, you will see the data before the modification, whereas in the post-stage you see the data after the modification
  4. Some Image Entities are not relevant for some messages - e.g. there is no PreImage for a Create message, and no PostImage for a Delete message

Data in the Name attribute

If the message is updating CRM (e.g. a Create or Update message) then the InputParameter only contains the minimum information that needs to be saved to CRM. A consequence of this is that the name attribute of any of the following data types is null:

  • Lookup
  • Owner
  • Customer
  • Picklist
  • Boolean

So, if your code needs to access the name, then you cannot rely on the InputParameter, and have to use either the Image Entity or a Retrieve to get the data.
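To make the difference concrete, here is a minimal CRM 4.0 plug-in sketch for a step on the account entity; the image alias "PostImage" and the primarycontactid attribute are simply examples of what you might register and read, not fixed names:

using Microsoft.Crm.Sdk;

public class LookupNamePlugin : IPlugin
{
    public void Execute(IPluginExecutionContext context)
    {
        // The Target InputParameter: only the attributes being written, so the
        // name labels of lookups, picklists, etc. are typically null here.
        if (context.InputParameters.Contains(ParameterName.Target) &&
            context.InputParameters[ParameterName.Target] is DynamicEntity)
        {
            DynamicEntity target = (DynamicEntity)context.InputParameters[ParameterName.Target];
            // target tells you what changed, but not the display names.
        }

        // An image registered on the step (alias "PostImage" here): a snapshot of
        // the record that does include the name labels.
        if (context.PostEntityImages.Contains("PostImage"))
        {
            DynamicEntity image = (DynamicEntity)context.PostEntityImages["PostImage"];
            if (image.Properties.Contains("primarycontactid"))
            {
                Lookup primaryContact = (Lookup)image["primarycontactid"];
                string contactName = primaryContact.name; // populated in the image
            }
        }
    }
}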

My preference is to use an Image Entity, mostly as this reduces the code I have to write. The CRM SDK also suggests that this is more efficient, though I've not done any thorough performance testing on this to determine if this is relevant.

Blog Move: Speed Racer - Call CRM at speeds that would impress even Trixie

posted at: 9:37 AM by Aaron Elder


It's been a year since Invoke Systems and Ascentium merged, and we finally took down the old, clunky server that was hosting the old Invoke Systems blog.  That blog is of course still getting lots of hits, and due to popular demand I am going to migrate a few choice posts to our current blog.  Note that these posts are all related to Microsoft CRM 3.0.  Here is the first.

The other day I was doing a bit of testing to find the fastest way to make rapid calls to CRM Web Services.  The truth of the matter is that when you are calling CRM Web Services you are pretty much down to the metal; so the only places I found to make optimizations were at the .NET Web Service Request level and at the server level.  The good news is that with a few very easy tweaks you can improve rapid CRM calls by almost 50%!

The results are based on a test that involved performing 250 Account create operations in a single-threaded clean CRM 3.0 RTM environment.  Please note that this article is about improving performance for operations such as data imports and bulk operations.  The same user and CRM Service are re-used for all 250 calls.  Be warned that not all settings are "safe" to use in all scenarios... please read up on the suggestions prior to implementing them in a production system.

The results of this study are as follows.

Test                        Results
Raw Dog                     15402.1472
PreAuthenticate             14450.7792
PreAuthenticate & Unsafe    12638.1728
Just Unsafe                 9633.8528
Unsafe + IIS Tweaks         8862.744


So what does all this mean?  The answers are below.  For simplicity of the code samples, I assume you have already declared and set up a CrmService, the credentials are set, and the URL is configured.  I also assume there is a CRM Account called "acc" that is ready to go.  Something like this:

CrmService crm = new CrmService();
crm.Credentials = System.Net.CredentialCache.DefaultCredentials;
crm.Url = "http://localhost/MSCRMServices/2006/CrmService.asmx";

account acc = new account();
acc.name = "Test";

"Raw Dog" - This is the most straightforward and basic way of calling the CRM service, the code looks something like this:

crm.Create(acc);

PreAuthenticate - This is the first optimization that people seem to use and it does indeed provide a small benefit (~7%) over the default settings.  The code looks like this:

crm.PreAuthenticate = true;
crm.Create(acc);

So why does this work?  Reading the documentation suggests that this simply saves a round-trip required by NTLM's challenge-response authentication system.  MSDN: "With the exception of the first request, the PreAuthenticate property indicates whether to send authentication information with subsequent requests without waiting to be challenged by the server. When PreAuthenticate is false, the WebRequest waits for an authentication challenge before sending authentication information." - Of course since most systems have "Keep Alives" enabled and my scenario is using the same connection over and over, the savings are minimal.

PreAuthenticate & Unsafe - This attempt adds the "UnsafeAuthenticatedConnectionSharing" option to the mix and we get yet another boost in performance (~12% over our last test and ~18% for our first test).  The code looks like this:

crm.PreAuthenticate = true;
crm.UnsafeAuthenticatedConnectionSharing = true;
crm.Create(acc);

So why does this help?  When used in conjunction with "Keep Alives" this option keeps an authenticated connection open to the server.  MSDN: "The default value for this property is false, which causes the current connection to be closed after a request is completed. Your application must go through the authentication sequence every time it issues a new request.  If this property is set to true, the connection used to retrieve the response remains open after the authentication has been performed. In this case, other requests that have this property set to true may use the connection without re-authenticating. In other words, if a connection has been authenticated for user A, user B may reuse A's connection; user B's request is fulfilled based on the credentials of user A."

This option is very powerful, and before using it be sure to read up on it here.  Since the connection is authenticated and shared, you need to make sure that two different users don't come in on the same connection.  If they do, the server will think the user is the first user that opened the connection and will allow the 2nd user to do whatever the 1st user could, as if they were the same person.  There is a way around this using the "ConnectionGroupName" property.  For my scenario of a bulk import, the only user that will be using this connection is the migration user and it will be the same user from start to end, so we are ok.
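Here is a sketch of that workaround, continuing the setup from above; which identity you key the group on is your choice, I just use the current Windows identity:

// Give each distinct user their own connection group so an authenticated
// connection is never reused across different users.
crm.UnsafeAuthenticatedConnectionSharing = true;
crm.ConnectionGroupName = System.Security.Principal.WindowsIdentity.GetCurrent().Name;
crm.Create(acc);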

Unsafe - Now something curious happens when we keep PreAuthenticate off, but leave UnsafeAuthenticatedConnectionSharing on.  This is where things get a bit odd: this is faster than having both options on; a full 38% faster than the original test as a matter of fact!  The code looks like this:

crm.UnsafeAuthenticatedConnectionSharing = true;
crm.Create(acc);

Why does this work?  Well, the answer is I don't know.  I have talked to people on the CRM team and the .NET team, and nobody has a really good answer.  The good news is that it does; perhaps it is best to leave it at that.

Tweaks - Finally, if you follow all the recommended steps for configuring a high-performance ASP.NET application (disable logging, enable ISAPI caching, make sure ASP.Net is tweaked) you can get a final little nip of performance.  Basically do what this article says and we get another 9% bump in performance.

 

Disclaimer:
This posting is provided "AS IS" with no warranties, and confers no rights.