Saturday, 29 January 2011

How to create a certificate for Ironport Email Security Appliance

This is a procedure to generate a certificate that you can import into your Ironport Email Security appliance.

Here are the basic steps:

  • Generate a certificate request using OpenSSL for Windows. Change the command line to your liking; the important thing is to change the common name (CN) to the URL that you want to use to access your Ironport appliance.
openssl req -new -newkey rsa:2048 -nodes -out ironport_domain_com.csr -keyout ironport_domain_com.key -subj "/C=HR/ST=Grad Zagreb/L=Zagreb/O=Organization/OU=IT/CN=ironport.domain.com"
  • Sign the request file (CSR) using a Windows CA. You can use the web application (https://servername/certsrv) of your Issuing CA, paste the CSR there and use the Web Server template.
  • Convert the output CER file to a PEM file
openssl.exe x509 -in ironport.cer -inform der -out ironport.pem -outform pem
  • Generate a P12 file that includes the private and public keys
openssl.exe pkcs12 -export -out ironport.p12 -in ironport.pem -inkey ironport_domain_com.key
  • Import the P12 file to your Ironport using the web GUI (Network > Certificate > Add Certificate)
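Before importing the P12, it is worth sanity-checking that the signed certificate really belongs to the private key you generated. A quick way to do this with OpenSSL (file names taken from the steps above) is to compare the modulus of the key and of the certificate; the two digests must be identical:

```shell
# Hash the RSA modulus of the private key and of the signed certificate.
# If the two digests differ, the CA response does not match your key.
openssl rsa -noout -modulus -in ironport_domain_com.key | openssl md5
openssl x509 -noout -modulus -in ironport.pem | openssl md5
```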

Friday, 28 January 2011

Always send a read receipt in Outlook

I wrote a custom Group Policy Administrative Template that sets the "Always send a read receipt" option in Outlook clients. This custom ADM template is useful if you want to enforce this setting for all users (or a subset of users) in a domain while still letting them control the other Tracking options in Outlook, which would otherwise be disabled if you used the Office administrative templates downloaded from the Microsoft site.

Just copy/paste the following code into Notepad and save the file with an ADM extension. Import the file into a Group Policy object under "User Configuration".


CLASS USER

CATEGORY "Outlook 2003 Receipt Response"
  POLICY "Always send response"
    KEYNAME "Software\Microsoft\Office\11.0\Outlook\Options\Mail"
    PART "Always send response for read receipt requests" CHECKBOX
      VALUENAME "Receipt Response"
    END PART
  END POLICY
END CATEGORY

CATEGORY "Outlook 2007 Receipt Response"
  POLICY "Always send response"
    KEYNAME "Software\Microsoft\Office\12.0\Outlook\Options\Mail"
    PART "Always send response for read receipt requests" CHECKBOX
      VALUENAME "Receipt Response"
    END PART
  END POLICY
END CATEGORY

CATEGORY "Outlook 2010 Receipt Response"
  POLICY "Always send response"
    KEYNAME "Software\Microsoft\Office\14.0\Outlook\Options\Mail"
    PART "Always send response for read receipt requests" CHECKBOX
      VALUENAME "Receipt Response"
    END PART
  END POLICY
END CATEGORY


Wednesday, 26 January 2011

How to properly issue a certificate for Forefront TMG Standalone Arrays in a workgroup


Due to the problems and pain we encountered getting a Forefront TMG 2010 Standalone Array in a workgroup to work on VMware ESX 3.5 Update 5, I will detail the steps for creating and importing certificates into the TMG certificate store and point out the problems with TMG Control service dependencies.

This is the environment we had:
  • Two Forefront TMG 2010 Enterprise Servers in a workgroup, configured as a Standalone Array with one TMG configured as Array Manager and the other as Array Member
  • Windows Server 2008 R2 Standard
  • Virtual machines on VMware ESX 3.5 Update 5

During the implementation we experienced a problem with the Forefront TMG Control service taking 10 minutes to start after a server restart. The service would eventually start, but the other Forefront services that depend on it would fail to start.

The problem was solved by implementing the fixes described below.

I am not implying that the problem with the TMG Control service hang is related only to VMware ESX; it is simply the only platform on which we experienced it. When we encountered the problem we ran some tests in a Hyper-V environment and in a separate VMware ESX 3.5 Update 5 environment and there was no problem; on this particular environment, however, the service would not start once the TMG array was configured.

So I would recommend that anyone get these dependencies fixed, even if you do not encounter this problem now. Regarding the problem with certificates, I have already blogged about it here, but I also wrote a procedure for properly issuing a Server Authentication certificate for TMG arrays in a workgroup.

How to issue a proper "Server Authentication" certificate

To follow this procedure you will need:
  • Access to any Windows Server 2008 IIS 7.0 web server
  • Access to an Enterprise or Standalone Windows Server 2008 Certification Authority (a Windows 2003 CA is also okay)

1. Open the IIS Manager, click the server name node in the left pane and click "Server Certificates" in the middle pane.

2. Click "Create Certificate Request" in the right pane.

3. In the "Common name" field, type the FQDN of the TMG server that will act as the Array Manager. In this example we will use "". Fill in the remaining fields so that they best describe your organization.

4. Choose "Microsoft RSA SChannel Cryptographic Provider" for the "Cryptographic service provider" and choose 2048 for the "Bit length".

5. Save the certificate request as C:\tmg01.req.
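If you have OpenSSL handy, you can double-check the request before submitting it to the CA; the subject should contain the FQDN you typed in step 3, and the built-in check should report that the request's signature verifies:

```shell
# Print the subject of the CSR and verify its self-signature
openssl req -in C:\tmg01.req -noout -subject -verify
```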

6. Navigate to the Issuing or Root CA web site, such as https://yourservername/certsrv, and click "Request a certificate".

7. Click "advanced certificate request".

8. Click "Submit a certificate request by using a base-64-encoded CMC or PKCS #10 file, or submit a renewal request by using a base-64-encoded PKCS #7 file".

9. Paste the contents of the tmg01.req file that you created earlier in IIS into the "Base-64-encoded certificate request" field. If you have a drop-down list of Certificate Templates, select the "Web Server" template.

10. Your certificate request is now submitted to the CA. If the "Request Handling" property of your CA is set to automatically issue certificates, you will be presented with a page where you can download the issued "cer" file. Click "Download certificate", save the file as C:\tmg01.cer and skip to step 15.

If "Request Handling" is set so that the administrator must issue certificates manually, you will have to perform the following steps.

11. Open the "Certification Authority" console on your Issuing CA server and click on "Pending Requests". You should see your request in the right pane.

12. Right click on the request and select All Tasks > Issue.

13. Browse to the CA web site again (https://yourservername/certsrv) and click "View the status of a pending certificate request". Your "Saved-Certificate Request" should be listed there.

14. You are now presented with the same page as in step number 10. Download the "cer" file as described in step 10 and proceed to step 15.

15. Return to the IIS Manager console from which you created the certificate request and select "Complete Certificate Request".

16. On the "Specify Certificate Authority Response" screen, browse to the "cer" file you downloaded from the CA and enter a friendly name for the certificate. I usually use the same name as the common name.

You have now completed the procedure for issuing the "Server Authentication" certificate. If you open the "Local Computer" Certificates store on the server where you requested the certificate, you should see it in the Personal > Certificates folder. The certificate icon should show a little yellow key, which means that you have both the private and public keys. We must export the certificate with both keys so that we can import it on our TMG server.

17. Right click on the certificate and click All Tasks > Export.

18. Select "Yes, export the private key".

19. "Personal Information Exchange - PKCS #12 (.PFX)" should be selected. Clear all the checkboxes and click Next.

20. Type a password; you will need it when you import the certificate on the TMG computer.

21. Save the certificate as C:\tmg01.pfx.
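Before copying the PFX file to the TMG servers, you can list its contents with OpenSSL to confirm that both the certificate and the private key made it into the file ("YourPassword" stands for the password you chose in step 20):

```shell
# Dump the PFX contents; the output should include both a certificate
# and a private key block
openssl pkcs12 -in tmg01.pfx -info -nodes -passin pass:YourPassword
```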

Now that our certificate is ready for import, there is still one thing we must do. Since we are creating the TMG array in workgroup mode, we must import the root certificate of the CA that issued the certificate to all of the TMG servers that will participate in the array. But first we must export the root CA certificate from a computer that has it.

22. Open the "Local Computer" Certificates store on the Issuing CA computer, or on some other domain-member computer in the domain where the CA resides.

23. Navigate to the Trusted Root Certification Authorities > Certificates, right-click on the root certificate from the CA which issued your certificate and select All Tasks > Export.

24. Select "DER encoded binary X.509 (.CER)" and click Next.

25. Save the "cer" file to disk. In our example it is C:\CompanyRootCA.cer.
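Because the file was exported in DER format, you must pass -inform der if you want to inspect it with OpenSSL; this is also a quick way to confirm that you exported the right root certificate:

```shell
# Print the subject and issuer of the DER-encoded root CA certificate;
# for a root certificate the two should be identical
openssl x509 -in CompanyRootCA.cer -inform der -noout -subject -issuer
```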

Now we have both the PFX file, which contains the public and private keys for the TMG computer certificate, and a CER file, which contains the public key of our root CA. The next thing we must do is import the root certificate on each TMG server that will participate in the array, and then import the "Server Authentication" certificate.

Note: It is good practice to create a "Server Authentication" certificate for every TMG server so that, if the Array Manager fails, you can promote another Array Member to Array Manager.

26. Open the "Local Computer" Certificates store on each TMG server and import the root certificate "cer" file to the "Trusted Root Certification Authorities".

27. Now open the "Forefront TMG Management" console on the TMG server that will act as an Array Manager. Expand "Forefront TMG" in the left pane and click on System node. Click on the TMG server name in the center pane and click on the "Install Server Certificate" in the right pane.

28. Now browse to the "pfx" file you exported from the web server computer and type the password for the file. Clear the checkbox "Automatically create the root CA certificate on this array manager." In my experience, leaving this checkbox checked always resulted in an error, even though the pfx file contained the root CA certificate. Click OK.

Now, if you open the Certificates store for the Windows service named ISASTGCTRL, you should see the imported certificate with its private key in the Personal store.

So why is it important to use the Forefront TMG Management console to import the certificate? You could just import the certificate into the Local Computer Certificates store, right? Well, the answer is yes and no. If you do it that way, the ISASTGCTRL service will not have enough permissions to read the private key file that is stored in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys; you will get Schannel errors in the Windows event log, and the TMG Control service will not be able to communicate with the ADAM service (ISASTGCTRL) on the Array Manager computer. If you use the Forefront TMG Management console to import the certificate, all the necessary permissions are added to the private key file, including:
  • fwsrv
Of course, you could manually update the permissions on the private key file if you knew which private key it is, and things would work, but it is not the proper way to do this.

Test the connection

Now there is only one thing left: to test the secure LDAP connection to the Array Manager server. We will use ldp.exe for this; you should be able to run it from your TMG servers.

Open ldp.exe and click Connection > Connect. Type the FQDN of the TMG server that will act as Array Manager and type 2172 for the port number, as this is the port on which the ISASTGCTRL service listens. Check SSL and click Connect.
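If you prefer the command line to ldp.exe, openssl s_client can exercise the same TLS handshake against the ISASTGCTRL port (the host name below is a placeholder for your Array Manager's FQDN). A successful handshake prints the server certificate chain, whose subject should match that FQDN:

```shell
# Attempt a TLS handshake against the AD LDS (ISASTGCTRL) port
# on the Array Manager and show the certificates it presents
echo | openssl s_client -connect tmg01.domain.com:2172 -showcerts
```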

If the connection is successful, you will see a screen like the following:

And that is all there is to it! Make sure to complete the procedure on all TMG servers that will participate in the array, for the reason already mentioned: so that another Array Member can become Array Manager in case the Array Manager fails.

Sunday, 23 January 2011

Forefront TMG and persistent cookies problem


I just want to point out a problem I have just resolved. A customer reported a problem when accessing a Sharepoint 2010 site that is published using Forefront TMG 2010. The Sharepoint listener is configured to use forms-based authentication with Persistent Cookies enabled for both public and private computers. For those who want to know how this works, you can read my article here.

The problem was that when "Private computer" was selected in the authentication form, access worked fine, but when "Public or shared computer" was selected, the user could not be authenticated. The TMG log showed the following error:

12302 - The server denied the specified Uniform Resource Locator (URL). Contact the server administrator.

When the listener was modified to use Persistent Cookies only for Private computers, authentication also worked for Public computers, but then the desired authentication cookie was not saved to the local client cache. The problem appeared in IE and Firefox and on different client computers, so it was not related to the local client cache or anything like that. The TMG and Sharepoint servers were restarted, but that did not help either.

The problem was resolved only by recreating the Sharepoint listener on TMG with all the same properties as before. Now both Public and Private computers work with Persistent Cookies enabled.


Wednesday, 19 January 2011

Demystifying "The Cloud"


For the last two days I attended a Private Cloud workshop at the Microsoft office in Warsaw. The workshop's official name was actually Government Private Cloud Computing, but it gave a great overview of the types of "clouds" and the types of services that can be delivered through the cloud. It also covered every aspect of building a cloud solution, from technology to process management, automation and billing. What struck me the most is how easily each specific cloud type can be defined. I have been reading about cloud infrastructure for a while now, and I have even been architecting and implementing public cloud solutions based on Exchange 2007/2010 and Sharepoint 2007/2010, and the workshop helped me sort out in my head everything I have learned so far. The main goal of the workshop was to teach us how to have a conversation with a customer company that plans to go to "the cloud" and how to ask some basic questions that will help define the cloud or the service that is to be offered through it.

In the next couple of paragraphs I will try to summarize the different service delivery types and the different types of cloud, and map them to actual technologies or scenarios based on my personal experience.

Let us first cover the service delivery types:
  • Infrastructure as a Service (IaaS)
  • Platform as a Service (PaaS)
  • Software as a Service (SaaS)

Infrastructure as a Service

The blue boxes show what the service provider actually delivers to the customer: Datacenter, Networking, Computers and Virtualization. In practice, that means the provider company takes care of system rooms, electricity and cooling (Datacenter); networking components such as switches and routers (Networking); the servers that comprise the monitoring, provisioning, billing and virtualization infrastructure (Computers); and the virtualization technology itself, such as Hyper-V or VMware clusters. Everything above these layers can be dynamically created, self-serviced and pooled by the customer. In this case, that means the customer can self-provision virtual machines with the operating system of his choice, pool the amount of storage he needs for the virtual machine and install the applications he needs for his business. In the Microsoft world this functionality could be provided by System Center VMM Self-Service Portal 2.0.

Platform as a Service

Now let us move a few boxes up the stack. If the operating system and the storage are controlled by the service provider company and the customer can deploy his own applications that run on the underlying platform of choice, then this model is called Platform as a Service, or PaaS. The Middleware box is a grey area here. Consider a scenario where the customer is presented with an empty virtual machine on which he can deploy his own engine, like Oracle Application Server, and build his own applications on top of that; then the middleware would be controlled by the customer. Or we could provide the customer with a Microsoft Sharepoint 2010 site collection and give him the ability to deploy his own web parts or workflows. The customer would not have access to the operating system in this case; he would only have access to his own little isolated area on Sharepoint, and anything he deploys there is relevant only to him and does not affect other customers. Sharepoint 2010 has sandboxed solutions integrated, so this makes for a perfect example of Platform as a Service.

Software as a Service

If the service provider company has total control of the stack and the customer only consumes services, such as sending and receiving e-mail, uploading documents to Sharepoint or scheduling and running voice conferences, then this is called Software as a Service. The customer only sees the service he uses and has no control over any aspect of the underlying platform. Self-service in this case means that the customer administrator can provision a new mailbox for an employee inside his own company, or enable a user for voice conferencing so that he can share video and audio with his colleagues. Examples of these in the Microsoft world are, of course, Exchange, Sharepoint, OCS and CRM.

There are a few other characteristics of the service delivery models that must be met before we can call our solution a "cloud solution". The workshop I attended specified the following characteristics, which I will try to describe in my own words:
  • Resource Pooling - the ability to share resources such as network, memory and processing power, and to provide them on demand to the workload that needs them
  • Measured Service - the ability to measure service utilization per customer, such as the number of mailboxes used during the last month, the number of gigabytes used on storage for mailboxes, documents, databases etc., or the bandwidth that was consumed. The most important thing is that the customer is charged only for what he used last month. If the number of mailboxes decreased or the bandwidth utilization dropped from the previous month, the customer should be charged accordingly!
  • Broad Network Access - the service should be accessible from anywhere using any Internet connection
  • Rapid Elasticity - the service should be able to scale out or scale up in the shortest amount of time. Consider adding a new virtualization host to the infrastructure to increase processing power, or adding a new disk shelf to increase storage space or performance. The service should also be able to rapidly scale down when resources are not used; an example is moving running virtual machines to a smaller number of hosts and then shutting down the hosts with no load.
  • On-Demand Self-Service - the customer should be able to self-provision the service, such as creating a new mailbox, adding disk space to a virtual machine or enabling an employee for Sharepoint or OCS access

Now that we have defined the service delivery models and cloud service characteristics, we will cover the cloud deployment models. There are four deployment models:

  • Private cloud - this cloud infrastructure is operated solely for a specific organization. It can be managed by the organization itself or by a service provider, and may exist on premise or off premise. The owner of the infrastructure components can be either the customer or the service provider. I can think of a scenario in Microsoft Online Services and e-mail hosting: if your company is large enough, with plenty of e-mail users, Microsoft will provide a dedicated infrastructure just for you, managed and operated by Microsoft for your company only. There can also be other reasons for a private cloud infrastructure: an organization's security policy could require that data be separated and isolated from other organizations, which would require a dedicated e-mail server, or even dedicated hardware infrastructure, for that specific organization.
  • Community cloud - cloud infrastructure that is shared by several organizations and supports a specific community with shared concerns such as mission, security requirements, policy and compliance considerations. A good example of this scenario is government ministries or agencies sharing the same common infrastructure.
  • Public cloud - this cloud infrastructure is made for the general public and is owned and maintained by an organization (service provider) that sells cloud services. Examples would be e-mail services, shared or dedicated Sharepoint servers, virtual machines etc.
  • Hybrid cloud - a composition of two or more models (private, community or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability.


So where does the technology fit in? Almost every major vendor out there today delivers solutions that are cloud enabled or will help you get to the cloud. Microsoft naturally has its own set of solutions, which I will try to cover briefly here.

  • Hardware - hardware selection is totally up to you. You can choose the server, storage or network vendor you like or already work with.
  • Virtualization - there are at least three platforms I would recommend: Hyper-V, VMware and XenServer; they all do the job excellently.
  • Deployment - System Center Configuration Manager is a perfect tool for deploying new operating systems, whether provisioning new desktops for the customer or new Hyper-V hosts for the service provider, and for patching the existing infrastructure.
  • Monitoring - System Center Operations Manager is used for monitoring the entire environment and alerting when something goes wrong or you just want to be notified. It can also be used as a billing aid, to show your customer how many "resources" have been used in the past month.
  • Automation - Microsoft has a new member of the System Center family called Opalis, used for datacenter automation. It enables you to automate certain datacenter processes and helps you achieve "elasticity" for your cloud infrastructure. One example of automation could be installing a new OS on a server, joining it to a virtualization cluster, deploying new patches and updates and deploying a SCOM agent for monitoring; you can do all of that with this tool. Microsoft also provides the already mentioned System Center Virtual Machine Manager Self-Service Portal 2.0, which can be used to automate virtual machine deployment and can be further developed or branded for each customer.
  • Provisioning - a provisioning application in the cloud infrastructure is used in two different ways. If you work for a service provider company, you would probably use it to quickly provision a new customer and give him access to your services. If you are a customer, you will probably use the provisioning application (usually a web interface) to quickly create new virtual machines or new mailboxes for your employees. Microsoft has no unified software that covers all scenarios, but here are some examples. Exchange 2010 with SP1 has a multi-tenancy feature integrated and provides the so-called Exchange Control Panel, or ECP. If you are a customer provisioned on a shared or hosted Exchange 2010 server, you would probably get access to ECP and the ability to manage mailboxes for your employees only; other customers on the same Exchange server are completely isolated from your users. However, a provisioning application for the service provider side of Exchange 2010 does not really exist; there is only a PowerShell interface you can use to develop your own. Luckily, there are companies out there who already have software for that purpose.

Well, I hope that I have helped you understand some of the basics of cloud infrastructure and that the cloud paradigm is not so cloudy to you anymore :)

Please feel free to comment on this post and give your view of the terms and definitions surrounding "the cloud". I have to be honest and say that I am by no means an expert in this field, because the topic is so huge that it would probably take years of experience to understand it completely, but I can definitely say that the workshop I attended helped me gain some basic understanding.


Friday, 14 January 2011

Forefront TMG Enterprise Standalone Array does not start after server reboot

For the last couple of days my colleague and I were troubleshooting a brand new installation of a Forefront TMG Enterprise Standalone Array consisting of two nodes. The problem was that after a server restart, the Forefront TMG Control service would not start. It would hang in the starting state for about 10 minutes, after which it would eventually start, but the TMG Firewall service and all the other TMG services that depend on the TMG Control service did not start because of this timeout. After that we could start the services manually and the TMG array worked with no problem. The problem occurred only after a server reboot.

The environment:

  • Two Windows Server 2008 R2 Standard virtual machines in a VMware ESX 3.5 Update 5 environment
  • Forefront TMG 2010 Enterprise SP1 Software Update 1
  • Forefront Standalone Array in workgroup mode with one node designated as Array Manager and the other as Array Member
  • Each node had a server certificate installed in the local computer store with Extended Key Usages for Server Authentication and Client Authentication

Here are some of the errors we were seeing in the event log:

The Microsoft Forefront TMG Control service hung on starting.

The Microsoft Forefront TMG Firewall service depends on the Microsoft Forefront TMG Control service which failed to start because of the following error:
After starting, the service hung in a start-pending state.

The Microsoft Forefront TMG Managed Control service depends on the Microsoft Forefront TMG Control service which failed to start because of the following error:
After starting, the service hung in a start-pending state.

The Microsoft Forefront TMG Job Scheduler service depends on the Microsoft Forefront TMG Control service which failed to start because of the following error:
After starting, the service hung in a start-pending state.

Log Name:      Application
Source:        Windows Error Reporting
Date:          14.1.2011. 14:57:24
Event ID:      1001
Task Category: None
Level:         Information
Keywords:      Classic
User:          N/A
Computer:      *************
Fault bucket , type 0
Event Name: ServiceHang
Response: Not available
Cab Id: 0

Problem signature:
P1: isactrl
P2: mspadmin.exe
P4: 10
P5: 2

Attached files:

These files may be available here:

Analysis symbol:
Rechecking for solution: 0
Report Id: 363731e9-1fe6-11e0-846f-00155d274102
Report Status: 0

To better understand the problem, here are some technical details.

TMG members in a Standalone array communicate with the array manager, which has AD LDS (Active Directory Lightweight Directory Services) installed to provide configuration storage for the entire array. The array manager first saves the configuration to its local AD LDS instance, and the rest of the array members connect to it using secure LDAP, which requires a server certificate with the "Server Authentication" key usage. Strictly speaking, only the Array Manager requires the certificate; the Array Members only require the root certificate of the Certification Authority that signed it, located in the Trusted Root Certification Authorities store, so that they trust the Array Manager's certificate. But if the Array Manager fails, you have to manually promote one of the Array Members to Array Manager, and it would then require a server certificate of its own. So we installed a server certificate on both TMG computers.

Here comes the problem. Since the Array Member had its own server certificate with Extended Key Usage and Intended Purposes set to Server Authentication and Client Authentication, when authenticating to the remote AD LDS service it would present its client certificate; this process is known as mutual authentication, or MTLS. It seems that the TMG Control service does not like this behavior and times out after about 10-15 minutes, after which none of the TMG services start. The problem happened on both TMG computers: even though one TMG was the Array Manager and only needed to connect to its local AD LDS instance, it too tried to mutually authenticate to the local AD LDS service.

Well, the solution was in fact quite simple, but very hard and frustrating to find because there were almost no relevant logs to look at. On each TMG computer, the certificate properties should be modified to include only Server Authentication as the Intended Purpose.

Here is how to do it:
  • Open the Certificates MMC snap-in and connect to the Local Computer store
  • Navigate to Personal > Certificates > your_computer_certificate (the certificate's common name should be the FQDN of your TMG computer)
  • Double-click the certificate, click the Details tab and click Edit Properties
  • Choose the "Enable only the following purposes" radio button and check only Server Authentication
  • Restart the computer and see if the TMG services start normally

Of course, if your certificate has only Server Authentication in the Extended Key Usage field, you will not experience this issue.
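A quick way to see which purposes are baked into a certificate is to dump its Extended Key Usage field with OpenSSL (assuming you have a PEM-encoded copy of the certificate, here called tmg01.pem). A certificate that lists both server and client authentication is the kind affected by this issue:

```shell
# Show the Extended Key Usage of the certificate; a certificate listing
# both "TLS Web Server Authentication" and "TLS Web Client Authentication"
# will trigger the mutual-authentication behavior described above
openssl x509 -in tmg01.pem -noout -text | grep -A1 "Extended Key Usage"
```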

Microsoft also has something to say about this in the following article, although it is not directly related to this problem.

Client logon is slow and server certificates used for Web publishing are configured with the default purpose settings "Server Authentication" and "Client Authentication"

Issue: When Windows Server 2003 detects the default purpose setting of "Client Authentication", the operating system attempts to perform TLS with mutual authentication to the domain controller. The mutual authentication process requires ISA Server to have access to the private key of the server certificate with the "Client Authentication" setting enabled, and ISA Server does not (and should not) have this access.

Solution: Ensure that all server certificates do not have the default "Client Authentication" purpose enabled. You can disable this setting on the property pages of the relevant server certificate as follows:

Disable Client Authentication purpose on a certificate
1. Open the Certificates Microsoft Management Console (MMC) snap-in. To add the Certificate Manager to the MMC, do the following:
  • Click Start, and then click Run.
  • Type mmc and then press ENTER.
  • Select the File menu, and then select Add/Remove Snap-in.
  • In the Add/Remove Snap-in box, click Add.
  • Double-click the Certificates snap-in, select Computer Account, and then click Finish.
  • Select Local Computer, and then click Finish.
  • Close the dialog boxes.
2. In the Certificates MMC, expand the Certificates node, and then expand Personal.
3. Right-click the relevant certificate and then click Properties.
4. On the Details tab, click Edit Properties.
5. Select Enable only the following purposes, and clear the Client Authentication purpose.

      Link to the entire article here.

      While troubleshooting this issue, out of pure frustration we even replicated the entire environment on Windows Server 2008 SP2, and later even on Hyper-V as the virtualization platform, to eliminate any compatibility issues, but in the end it was this little setting that did the trick.

      We also tried (read this carefully) Rollup 1 and Rollup 2 for Software Update 1 for Service Pack 1 for TMG 2010, just to be sure the entire environment was patched, and read numerous blogs about TMG Control service dependency issues that can arise after installing TMG updates and rollups, but none of that helped.

      I really hope this article will someday save someone a lot of time :)

      Here is a link to a blog article that describes some other startup issues that you may have with TMG related to service dependency ordering.

      Wednesday, 12 January 2011

      How to customize Sharepoint E-Mail-Enabled document library using SPEmailEventReceiver

      I guess it is not news to anyone that Sharepoint can receive e-mails in a document library. From its inception, this feature was meant to finally replace Exchange Public Folders. Yet Exchange 2010 still has Public Folders, even though Sharepoint has been able to receive e-mails since the 2003 version! One of the reasons E-Mail-Enabled document libraries in Sharepoint are still not widely used must be that the out-of-the-box feature is pretty inflexible. You can do just a couple of things with it, such as:
      • Save files that are attached to an e-mail in a document library
      • Save original e-mails as a separate eml item in a document library
      • Choose whether you want to overwrite attached documents with the same name
      And that's about it.

      So, what if you are receiving e-mails whose attachments all have the same file name but different content? You could not use this feature, because your only options would be to overwrite or not overwrite the file with the same name, and either way you could not keep every attachment you receive. Sharepoint does allow you to group attachments into separate folders based on the sender address or the e-mail subject. But what if you are receiving e-mails from the same sender with the same subject every time?

      Consider a fax machine (yes, faxes are still a very popular thing) that sends an e-mail with a PDF attachment representing the fax message. You specify the e-mail address these messages will be sent to; let's say it is your Sharepoint E-Mail-Enabled document library. A fax machine rarely has an option to change the e-mail subject with each message it sends, and the sender address is always the same.

      Fortunately, the fax machine I encountered always attaches the PDF with a different name, so we were able to store every received PDF/fax in the document library. But the message body also contains important information, such as the number the fax was received from, the destination number and the receive date. If we chose the option to save only attachments to the document library, we would lose this information forever. If we chose the option to save original e-mails as separate eml items, we could only save one eml item, because every e-mail has the same subject! And of course it is not intuitive at all to open an eml item just to see the message body.

      The solution to this kind of problem is the SPEmailEventReceiver event handler in Sharepoint 2010: with a little bit of effort we can catch every e-mail destined for this document library and change the default behavior to whatever suits our scenario.

      By default, Sharepoint represents this GUI when we click on "Incoming e-mail settings" on the "Library Settings" page:


      But once we register our own event handler we only get something like this:

      This is because we can now only specify an e-mail address for the document library and we should do the rest of the customization with Visual Studio 2010.

      To create a custom SPEmailEventReceiver you need to install Sharepoint 2010 and Visual Studio 2010 on the same Windows Server 2008 R2 machine. Otherwise, if you try to create a new project using the Sharepoint 2010 templates, you will get an error message like this:

      Now, let's say that our fax device sends e-mails with the same subject every time and an attachment whose filename is always different. The message body also contains a "FROM=" line that specifies the sending fax number. Once the e-mail is picked up by Sharepoint from the local SMTP service drop folder and delivered to the document library, our SPEmailEventReceiver will save the attachment in the document library and read the message body to extract the FROM field. The sending fax number is saved in a custom FaxNumber field that we can later use to create views that group received faxes by sender.

      I will now cover the basic steps to create such an event receiver:
      • Create a Sharepoint site
      • Create a Document Library and enable it for e-mails - type only the e-mail address; the rest of the properties are not important to us
      • Open Visual Studio 2010 and create a new project from Sharepoint 2010 template and select Event Receiver template
      • Enter the URL which points to the Sharepoint site you have created and select "Deploy as farm solution" option
      • Select "List Email Events" for the event receiver type and "Document Library" for the event source. Of course you can select any other event source that you need such as "Custom List" for example
      When you complete the steps you should be presented with a code template that you can customize to your own needs. Here is the code that I used:

      using System;
      using System.Security.Permissions;
      using Microsoft.SharePoint;
      using Microsoft.SharePoint.Security;
      using Microsoft.SharePoint.Utilities;
      using Microsoft.SharePoint.Workflow;
      using System.Runtime.InteropServices;

      namespace FaxReceiver.EventReceiver1
      {
          /// <summary>
          /// List Email Events
          /// </summary>
          public class EventReceiver1 : SPEmailEventReceiver
          {
              /// <summary>
              /// The list received an e-mail message.
              /// </summary>
              public override void EmailReceived(SPList list, SPEmailMessage emailMessage, String receiverData)
              {
                  //base.EmailReceived(list, emailMessage, receiverData);
                  SPEmailAttachmentCollection attachColl = emailMessage.Attachments;
                  foreach (SPEmailAttachment attach in attachColl)
                  {
                      try
                      {
                          AddFileToDocLib(list, attach, attach.FileName, emailMessage.PlainTextBody);
                      }
                      catch (SPException ex)
                      {
                          if (ex.ErrorCode == -2130575257) // a file already exists
                          {
                              // prepend a fixed number to the file name and retry
                              AddFileToDocLib(list, attach, "123456" + attach.FileName, emailMessage.PlainTextBody);
                          }
                      }
                  }
              }

              private void AddFileToDocLib(SPList list, SPEmailAttachment attach, string filename, string messagePlainText)
              {
                  SPFile file = list.RootFolder.Files.Add(list.RootFolder.ServerRelativeUrl + "/" + filename, attach.ContentStream);
                  ExtractMessageProperties(messagePlainText, file);
              }

              private void ExtractMessageProperties(string messagePlainTextBody, SPFile file)
              {
                  file.Properties["MessageBody"] = messagePlainTextBody;
                  string[] messageBody = messagePlainTextBody.Split("\n\r".ToCharArray(), StringSplitOptions.RemoveEmptyEntries);
                  foreach (string str in messageBody)
                  {
                      if (str.IndexOf("FROM", 0) != -1)
                      {
                          string strFaxNumber = str.Replace("FROM=", "");
                          file.Properties["FaxNumber"] = strFaxNumber;
                      }
                  }
                  file.Update(); // persist the property changes
              }
          }
      }
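The FROM= extraction in ExtractMessageProperties is plain string handling and can be sanity-checked outside Sharepoint. Here is the same logic as a small Python sketch (the sample body is hypothetical):

```python
def extract_fax_number(plain_text_body):
    # Normalize line endings, then look for the line carrying the
    # sending fax number (slightly stricter than the C# version,
    # which matches any line containing "FROM").
    for line in plain_text_body.replace("\r", "\n").split("\n"):
        line = line.strip()
        if line.startswith("FROM="):
            return line.replace("FROM=", "")
    return None

# Hypothetical fax message body
body = "DATE=2011-01-12\r\nFROM=23456789\r\nTO=11223344\r\n"
print(extract_fax_number(body))  # 23456789
```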

      For this code to work on your document library you need to create these custom fields:
      • FaxNumber (Single line of text) - used for storing sending fax numbers
      • MessageBody (Multiple lines of text) - used for storing the complete message body for later reference
      Now you should build and deploy your solution using the Build menu in Visual Studio. Your feature should be automatically activated on the site collection. You can check that by going to the root site of the site collection and selecting Site Settings > Manage site features. You should see something like this:

      Now you can send an e-mail with an attachment; make sure the message body contains the "FROM=" line with the fax number next to it, such as "FROM=23456789".

      Debugging SPEmailEventReceiver

      Of course, you will run into problems sooner or later when playing with event receivers, so it is useful to know how to debug them using Visual Studio. The MSDN documentation here describes how to debug event receivers. That procedure works for perhaps all event receivers except SPEmailEventReceiver, so you will need a different technique. The problem is that SPEmailEventReceiver does not run in the w3wp.exe worker process as Visual Studio assumes. In fact, it runs in the owstimer.exe process, which is the "Sharepoint 2010 Timer" service. Unlike the rest of the event handlers, which are triggered when an item is added, updated or deleted in a document library, SPEmailEventReceiver is triggered by the Sharepoint Timer service when an e-mail is picked up from the SMTP drop folder on the local Sharepoint server. This means that if we want to use breakpoints in our SPEmailEventReceiver we need to attach the debugger to the OWSTIMER.EXE process; only then will the breakpoints be hit. You are not required to set "Active Deployment Configuration" to "No Activation" as stated in the MSDN article, but each time you deploy a solution you need to restart the "Sharepoint 2010 Timer" service for the new dll to be loaded.

      One additional piece of information available when debugging SPEmailEventReceiver can be found in the following Windows Event Log: Applications and Services Logs > Microsoft > Sharepoint Products > Shared > Operational. There you will see that the Sharepoint 2010 Timer service processes e-mails from the SMTP drop folder every minute. If it fails to process e-mails for some reason, the error message will be logged here.

      I will not describe how to set up the SMTP service on a Sharepoint server to receive e-mails, because it is already well described on many blogs. But to make things easier for you: I use the Windows Live Mail client installed on my development server, with a profile that points to the local SMTP server. When I send an e-mail, I send it to something like shared.documents@vssp2010.local, where "shared.documents" is what I typed in the document library incoming e-mail settings and vssp2010 is the name of the development server. Configure the SMTP service to accept e-mails for the vssp2010.local domain and you can easily generate e-mails for your Sharepoint.
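If you would rather script the test messages than use a mail client, the Python standard library can build a message shaped like the one the fax device sends. The address and host names below come from this example setup, so adjust them to your environment; the delivery step is commented out because it assumes your local SMTP service accepts anonymous submissions:

```python
import smtplib
from email.message import EmailMessage

# Mimic the fax device: fixed subject, FROM= line in the body,
# and a PDF attachment with a unique file name.
msg = EmailMessage()
msg["From"] = "fax@vssp2010.local"
msg["To"] = "shared.documents@vssp2010.local"
msg["Subject"] = "Received fax"
msg.set_content("FROM=23456789\r\nTO=11223344\r\n")
msg.add_attachment(b"%PDF-1.4 dummy content", maintype="application",
                   subtype="pdf", filename="fax_0001.pdf")

# Uncomment to deliver via the local SMTP service:
# with smtplib.SMTP("vssp2010") as smtp:
#     smtp.send_message(msg)
print(msg["Subject"])  # Received fax
```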

      I wish you happy customizing!

      Monday, 10 January 2011

      Multiple authentication prompts when opening documents from WSS/MOSS/Sharepoint


      This one was bugging me for a long time but I never really had time to solve it for myself until I had to solve it for a customer, how typical for me :)

      Consider a following scenario:
      • Sharepoint site is published through ISA/TMG server
      • The user authenticates to the published site using Basic authentication (popup window appears)
      • When the user tries to open a document from a document library he needs to authenticate again, sometimes even multiple times
      • If the user accesses the Sharepoint site directly, from inside the network, he experiences no issues
      The reason this happens is that Office applications look for a persistent authentication cookie stored on the local client. If the cookie is not available, we get a logon prompt when, for example, Word opens a docx file from the Sharepoint site. We do not get the prompt when we access the site from the internal network, because there the Office applications authenticate using Windows Integrated authentication and we are saved from typing credentials.

      This persistent cookie sharing between IE and Office applications is described here in more detail:

      So how do we get the persistent cookie from site published through ISA/TMG server?

      Complete the following steps:
      • Switch from Basic authentication in ISA/TMG listener to Forms-based authentication
      • Turn the Persistent cookies support on ISA/TMG listener
      • Add the site to the Trusted Sites list in Internet Explorer client
      • If we are using Internet Explorer 7/8 we also need to turn off "Protected Mode" for Trusted Sites (it should be off by default)
      Now we only need to log in once using the Forms-based authentication window and we can open any document from the Sharepoint site without additional authentication prompts, because the cookie has now been saved to a location from which Office applications can access it.
      For detailed steps on how to turn on "Persistent Cookies" on ISA/TMG listener please visit this blog:

      You will notice one more thing: if you exit the Internet Explorer window without first logging off from the Sharepoint site, the next time you visit the same site you will not be presented with the Forms-based authentication window; instead you will be automatically authenticated. This is because the cookie is "persistent" and will only be deleted if you explicitly log off or when it times out. On the ISA/TMG listener you can allow persistent cookies "Only on private computers" or "On all computers" (the user chooses the computer type when entering credentials on the Forms-based authentication form), and you can set the timeout in minutes separately for Public and Private computers.


      Microsoft NLB and HP Procurve switches?


      Last week I encountered an environment where I had to set up Microsoft NLB for Exchange 2010 CAS and Hub servers. The environment was a mixture of VMware 3.5 virtualization deployed in two separate system rooms, with a mixture of Cisco switches in IBM blade chassis and an HP Procurve 5300 used as the central routing switch.

      The main question was whether NLB would work at all, since my Internet searches suggested that it is not possible to set up a static ARP record on HP Procurve switches - something that is required on Cisco switches if you want the multicast NLB virtual IP to be reachable from different subnets. I could not find an answer on whether multicast NLB works across subnets on HP Procurve, and we were additionally concerned because the NLB virtual machines were spread across the two rooms, and thus across physically separate switches, although still in the same VLAN. Unicast NLB was not an option because VMware supports it only if the virtual machines run on the same physical host.

      So we decided just to setup the NLB on Windows hosts and see if it works. And, it worked! :)
      The reason this works is probably that HP switches accept ARP replies that map a unicast IP address to a multicast MAC address. Cisco devices, in turn, do not allow this (see this link).
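For reference, in multicast mode NLB derives the cluster MAC address that appears in those ARP replies from the virtual IP itself: 03-BF followed by the four VIP octets in hex. A quick Python sketch (the VIP below is made up):

```python
def nlb_multicast_mac(vip):
    """Cluster MAC used by NLB in multicast mode: 03-BF + the VIP octets."""
    octets = [int(o) for o in vip.split(".")]
    return "03-bf-" + "-".join("%02x" % o for o in octets)

# Hypothetical cluster virtual IP
print(nlb_multicast_mac("192.168.10.50"))  # 03-bf-c0-a8-0a-32
```

On Cisco devices this is the MAC you would put into the static ARP entry (e.g. `arp 192.168.10.50 03bf.c0a8.0a32 arpa` in IOS) - exactly the step that turned out to be unnecessary on the Procurve.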

      Please leave a comment if you have more info on this subject.