Tuesday, April 23, 2013

SharePoint 2013 Multilingual User Interface (MUI) Switcher

In SharePoint Foundation 2010, when someone navigates to a multilingual website, the website uses the Accept-Language header that the client browser sends with the HTTP request to determine the language in which to render the user interface. If the website does not support any of the languages specified by the browser, the default language is used as the display language.
A multilingual website also displays a drop-down menu in the upper-right corner of the page, next to the user's name, where users can select a display language. When someone selects a language that is different from the current display language, the website switches to the new language. The user's preference is persisted in a cookie that is dropped on the client computer. The website gets the user's language preference from the cookie on subsequent visits to the site.
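SharePoint's own negotiation logic is internal, but the header-based selection described above amounts to something like this sketch (an illustration only, not SharePoint code):

```csharp
using System;

public static class LanguageNegotiation
{
    // Pick the first language from the Accept-Language header that the
    // site supports; fall back to the site's default language otherwise.
    public static string PickDisplayLanguage(string acceptLanguage, string[] supported, string defaultLanguage)
    {
        if (string.IsNullOrEmpty(acceptLanguage))
            return defaultLanguage;

        foreach (var part in acceptLanguage.Split(','))
        {
            // Each entry may carry a quality value, e.g. "en-US;q=0.8"
            var tag = part.Split(';')[0].Trim();

            foreach (var language in supported)
                if (string.Equals(language, tag, StringComparison.OrdinalIgnoreCase))
                    return language;
        }

        return defaultLanguage;
    }
}
```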



In SharePoint 2013, however, the language switcher drop-down has been removed. So we tried to implement our own switcher using the same concept as SharePoint 2010 (see Changing the Display Language), but adding the JavaScript turned out to be useless.

So we tried another approach: an HTTP module that intercepts the request at a very early stage (see this page for how to create and apply an HTTP module in SharePoint).

The trick is that the new SharePoint model for choosing the display language checks the user's language preferences: either the languages sent in the request's Accept-Language header (e.g. ar-SA,en-US), or the language preferences the user configures in his SharePoint user profile.
  
English Preference Only

Arabic then English Preference 

OK, we will not reinvent the wheel. The game plan is to intercept the request and check the cookie value (assuming we have a cookie preserving the currently selected language), add the selected language to the front of the Accept-Language request header in our HTTP module, then set the current thread culture to the selected language by implementing the PreSendRequestHeaders event handler.

UPDATE: Thanks to Suleman. Many people were facing issues with this approach, and he found that using the PreRequestHandlerExecute event handler instead solves them all:

e.g. context.PreRequestHandlerExecute += context_PreRequestHandlerExecute;


Following is the code we used to achieve the above scenario, setting the language to Arabic:



using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web;
using System.Threading.Tasks;
using System.Threading;
 
namespace MUISwitcher
{
    public class HTTPSwitcherModule : IHttpModule
    {
        #region IHttpModule Members
 
        public void Dispose()
        {
        }
 
        public void Init(HttpApplication context)
        {
            context.PreRequestHandlerExecute += context_PreRequestHandlerExecute;
        }
 
        void context_PreRequestHandlerExecute(object sender, EventArgs e)
        {
            HttpApplication httpApp = sender as HttpApplication;
            HttpContext context = httpApp.Context;
            string httpUrl = context.Request.Url.ToString();
 
            //TODO: Get the selected value for the current culture from the cookie, e.g. ar-SA,
            //and set the header and the current culture to the acquired value

            var lang = context.Request.Headers["Accept-Language"];

            if (string.IsNullOrEmpty(lang))
                context.Request.Headers["Accept-Language"] = "ar-SA";
            else if (!lang.Contains("ar-SA"))
                context.Request.Headers["Accept-Language"] = "ar-SA," + lang;
 
 
            var culture = new System.Globalization.CultureInfo("ar-SA");
 
            Thread.CurrentThread.CurrentCulture = culture;
            Thread.CurrentThread.CurrentUICulture = culture;
        }
 
        #endregion
    }
}
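The TODO in the module above is left open. Assuming the switcher drops a cookie named DisplayLanguage (a hypothetical name), the raw Cookie header can be parsed with a small helper along these lines, a sketch only:

```csharp
using System;
using System.Linq;

public static class LanguageCookie
{
    // Parse a raw Cookie header (e.g. "UserName=john; DisplayLanguage=ar-SA")
    // and return the value of the named cookie, or null if it is absent.
    public static string GetCookieValue(string cookieHeader, string cookieName)
    {
        if (string.IsNullOrEmpty(cookieHeader))
            return null;

        return cookieHeader
            .Split(';')
            .Select(part => part.Split(new[] { '=' }, 2))
            .Where(pair => pair.Length == 2 && pair[0].Trim() == cookieName)
            .Select(pair => Uri.UnescapeDataString(pair[1].Trim()))
            .FirstOrDefault();
    }
}
```

Inside the module you would call it with `context.Request.Headers["Cookie"]` and fall back to a default culture when it returns null.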

Sunday, April 14, 2013

Publishing Page Layout Image Field Disappears when the page is Published

I was facing this problem where the image field of a publishing page layout disappears when the page is published. I found out that the page layout I was using was not associated with the custom content type, since I had used SharePoint Designer to copy the layout from one environment to another. So I opened the master page gallery, edited the properties of this layout, and re-associated it with the content type, and everything is working OK now.

Tuesday, February 5, 2013

Getting the login failure cause for a TMG login using a SharePoint 2010 application page

Microsoft Forefront Threat Management Gateway (Forefront TMG), formerly known as Microsoft Internet Security and Acceleration Server (ISA Server), is a network security and protection solution for Microsoft Windows, described by Microsoft as a product that "enables businesses by allowing employees to safely and productively use the Internet for business without worrying about malware and other threats".

SharePoint sites comprise one of the more common types of content that are secured by the Forefront Edge line. This stems from the critical need to provide remote document management while at the same time securing that access. Although Forefront UAG is the preferred solution for reverse proxy of a SharePoint environment, the Forefront TMG product is also a highly capable product that allows for reverse proxy functionality.

On an invalid login attempt, the TMG login page only informs the user that either the user name or password is incorrect, so the user keeps trying until his account becomes locked, even though the failed attempt could be caused by an expired password, a locked account, a wrong user name, or an actually wrong password.

Your business requirements may demand telling the user why the attempt failed.

OK, I do not have any experience with TMG, but I needed a workaround for this issue. I found that the TMG login page is rendered from an HTML file under my TMG rule folder; this file is usr_pwd.htm. A lot of people talk about customizing this file for branding and applying your corporate look and feel to the page. But how about changing its behavior to redirect to another page when the user fails to log in, so we can show the failure cause?

And here is the solution I managed to find. This may not be the best solution, given my limited experience with TMG, but it works fine, which is what we all care about in the end :)

The game plan is as follows:
- Once the user clicks the "Sign in" button, we save his user name in a cookie.
- The sign-in button causes a page post back, so in the page load we check whether the login was unsuccessful; if so, we redirect the user to a SharePoint application page, passing the user name preserved in the cookie through the page query string.
- In the application page, using some LDAP methods, we find out why the user was not able to log in by checking the following:
  1. First check the lockout time property for the user; if it is not "0", the account is locked.
  2. Then check the password last set property; if it is "0", the password was never set by the user.
  3. Check the max password age property against the password last set property; if the remaining time span is less than zero, the password has expired.
  4. Otherwise check the user account control property to get the current status of the account.
Let's start some coding...

- "Sign in" button: save the user name in a cookie

To achieve this we will edit the file "usr_pwd.htm"; its path will be something like "C:\Program Files\Microsoft Forefront Threat Management Gateway\Templates\CookieAuthTemplates\{YOUR FOLDER NAME}\HTML\usr_pwd.htm".

Using some helper JavaScript methods already used in the "flogon.js" file, we will add JavaScript functions that create the UserName cookie, read it back, and check whether cookies are enabled.

 

     function setUsernameCookie() {
        var userName = getUser().value;
        if (chkCookies()) {
            document.cookie = "";
            setCookie("UserName", userName, 20 * 1000);
        }
        else {
            alert("Cookies not enabled");
        }
    }
    function setCookie(c_name,value,exdays){
        var exdate=new Date();
        exdate.setDate(exdate.getDate() + exdays);
        var c_value=escape(value) + ((exdays==null) ? "" : "; expires="+exdate.toUTCString());
        document.cookie=c_name + "=" + c_value;
    }
    
    function getCookie(c_name){
        var i,x,y,ARRcookies=document.cookie.split(";");
        for (i=0;i<ARRcookies.length;i++){
          x=ARRcookies[i].substr(0,ARRcookies[i].indexOf("="));
          y=ARRcookies[i].substr(ARRcookies[i].indexOf("=")+1);
          x=x.replace(/^\s+|\s+$/g,"");
          if (x==c_name)
            {
            return unescape(y);
            }
          }
    }
As you can see, some JavaScript functions such as "getUser()" and "chkCookies()" are already defined in "flogon.js", so we will just reuse them (we will not reinvent the wheel :)

Now we need to bind the "setUsernameCookie()" function to the onclick event. That is easy: find the input element that has the id "SubmitCreds" and add the onClick attribute as follows: onClick="setUsernameCookie()"

OK, step one (save the user name in a cookie): DONE.

- On page load check if the login was not successful, then redirect to a SharePoint application page

Now the user has clicked sign in and the cookie holds his user name, but how will we know it was an unsuccessful attempt? This is the part that was done in the least professional way :(, but here is how I managed to do it: I inspected the td element in which the login failure message shows up, gave it an id, and on page load checked whether the message is the failure message, then redirected the user.

After some element inspection I found that the failure message goes into the td element with class="wrng" whose inner text is @@INSERT_USER_TEXT, so I found it in the HTML and gave it the id tdLoginMessage; the result is:
  


<td class="wrng" id="tdLoginMessage">@@INSERT_USER_TEXT</td>

Now, on page load, if tdLoginMessage contains the failure message, redirect to an application page with the user name supplied in the page query string, AND YOU WILL BE BACK TO THE SHAREPOINT WORLD FINALLY :)



function window_onload() {
        onld();
        // The Arabic text below is TMG's login-failure message, roughly:
        // "Your sign-in failed. Verify the domain name, user name, and password, then try again."
        if(gbid("tdLoginMessage").innerText.indexOf("عذّر تسجيل دخولك إلى النظام. تأكد من صحة اسم المجال، واسم المستخدم، وكلمة المرور، ثم حاول من جديد") != -1){

            var userName = getCookie("UserName");

            if( userName != ""){
                window.location = "http://yourserver.local/_layouts/Public/ValidateLoginFailure.aspx?UserName=" + userName;
            }
        }
        
        if (chkCookies()) {
            ldCookie('username', 'password');
 
            var expl1 = document.getElementById('expl1');
            expl1.style.display = "";
 
            var lnkHidedSection = document.getElementById('lnkHdSec');
            lnkHidedSection.style.display = "none";
 
            var lnkShowSection = document.getElementById('lnkShwSec');
            lnkShowSection.style.display = "";
        }
        
        
    }
In my case the message was in this Arabic format, and the application page I created was ValidateLoginFailure.aspx.

- Find out why the user was not able to log in
Back to happy land. Now we will create an application page with the following properties:
- Map your page to a public folder where the anonymous rule on ISA/TMG is enabled for this path, as this page will be accessed by anonymous users (the guy failed to log in, so he is still anonymous :)
- Override the AllowAnonymousAccess property to return true so the page is viewable by anonymous users.



protected override bool AllowAnonymousAccess
{
    get
    {
        return true;
    }
}
Add the following to your web.config to enable anonymous access to the page:

  <location path="_layouts/Public/ValidateLoginFailure.aspx">
    <system.web>
      <identity impersonate="true" />
      <authorization>
        <allow users="?" />
      </authorization>
    </system.web>
  </location>
OK, in the page load we will get the user name and, using an LDAP query, determine the login failure cause.

Now you should be redirected from the TMG login page with a URL looking like: http://yourserver.local/_layouts/Public/ValidateLoginFailure.aspx?UserName=Domain\LoginName

We will get the DirectoryEntry object for this user so we can read the set of properties needed to determine the failure cause, using the following code:
  1. First check the lockout time property for the user; if it is not "0", the account is locked.
  2. Then check the password last set property; if it is "0", the password was never set by the user.
  3. Check the max password age property against the password last set property; if the remaining time span is less than zero, the password has expired.
  4. Otherwise check the user account control property to get the current status of the account.

protected void Page_Load(object sender, EventArgs e)
{
    try
    {
        if (Page.Request != null)
        {
            if (Page.Request.QueryString["UserName"] != null)
            {
                string UserName = Page.Request.QueryString["UserName"].ToString();
 
                if (HttpContext.Current.Session != null)
                {
                    uint LCID = (uint)HttpContext.Current.Session.LCID;
 
                    string StatusValue = String.Empty;
 
                    DirectoryEntry entry = null;
                    using (HostingEnvironment.Impersonate())
                    {
                        entry = new DirectoryEntry("LDAP://MyDomain/CN=" + UserName +",DC=MyDomain,DC=local");
 
                        if (entry != null)
                        {
                            long pwdLastSet = 2;
                            long ldate  = 0;
 
                            if (entry.Properties["pwdLastSet"].Value != null)
                            {
                                pwdLastSet = LongFromLargeIntegerObject(entry.Properties["pwdLastSet"].Value);
                            }
                            
                            if (entry.Properties["LockOutTime"].Value != null)
                            {
                                ldate = LongFromLargeIntegerObject(entry.Properties["LockOutTime"].Value);
                            }
                            
 
                            double? TimeRemainingUntilPasswordExpiration = GetTimeRemainingUntilPasswordExpiration(entry);
 
 
                            //Account Locked
                            if (ldate != 0)
                            {
                                StatusValue = GetLocalizedString("Status_LockedAccount", LCID) + DateTime.FromFileTime(ldate).ToString("dd/MM/yyyy");
                            }
                            //Account Password Expired
                            else if (pwdLastSet == 0)
                            {
                                StatusValue = GetLocalizedString("Status_PasswordExpired", LCID);
                            }
                            else if (TimeRemainingUntilPasswordExpiration != null && TimeRemainingUntilPasswordExpiration < 0)
                            {
                                StatusValue = GetLocalizedString("Status_PasswordExpired", LCID);
                            }
                            //Account Expired
                            else if (entry.ExpirationDate > new DateTime(2000, 1, 1, 2, 0, 0) && entry.ExpirationDate <= DateTime.Now)
                            {
                                StatusValue = GetLocalizedString("Status_AccountExpired", LCID);
                            }
                            else
                                StatusValue = GetStatusValue(LCID, StatusValue, entry);
 
                            SendNotificationMail(entry, LCID, StatusValue);
                            //lblStatus.Text = StatusValue;
                        }
                        else
                        {
                            lblStatus.Text = "Could not Query AD LDAP Operation , Entity object is NULL";
                        }
                    }        
                }
                else
                {
                    Response.Write("<!-- session null -->");
                }
            }
            else
            {
                lblStatus.Text = "No Query String supplied";
            }
        }
        else
        {
            lblStatus.Text = "Request rejected";
 
        }
    }
    catch (Exception ex)
    {
        lblStatus.Text = ex.Message + " " + ex.StackTrace;
    }
}
Following are the helper methods used in the Page_Load method:


private static double? GetTimeRemainingUntilPasswordExpiration(DirectoryEntry entry)
{
    if (entry.Properties.Contains("maxPwdAge"))
    {
        long pwdLastSet = LongFromLargeIntegerObject(entry.Properties["pwdLastSet"].Value);
 
        if (entry.Properties["maxPwdAge"].Value != null && pwdLastSet != 0)
        {
            // maxPwdAge is a COM large integer holding a NEGATIVE number of
            // 100-nanosecond intervals, so negate it and convert it to days
            var maxPasswordAgeDays = TimeSpan.FromTicks(-LongFromLargeIntegerObject(entry.Properties["maxPwdAge"].Value)).TotalDays;

            return maxPasswordAgeDays - (DateTime.Now - DateTime.FromFileTime(pwdLastSet)).TotalDays;
        }
        else
            return null;
    }
    else
        return null;
}
 
/// <summary>
/// Converts a COM large-integer value (e.g. lockoutTime or pwdLastSet)
/// to a long; requires "using System.Reflection;" for BindingFlags.
/// </summary>
/// <param name="largeInteger">The IADsLargeInteger value read from the DirectoryEntry</param>
/// <returns>The combined 64-bit value</returns>
private static long LongFromLargeIntegerObject(object largeInteger)
{
    System.Type type = largeInteger.GetType();
    int highPart = (int)type.InvokeMember("HighPart", BindingFlags.GetProperty, null,
  largeInteger, null);
    int lowPart = (int)type.InvokeMember("LowPart", BindingFlags.GetProperty, null, largeInteger, null);
    return (long)highPart << 32 | (uint)lowPart;
} 
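The FILETIME arithmetic behind the expiry check can be exercised in isolation. A small sketch (a hypothetical helper using plain .NET, not part of the page above):

```csharp
using System;

public static class PasswordExpiry
{
    // pwdLastSet is a FILETIME (100-ns intervals since 1601-01-01 UTC);
    // Active Directory stores maxPwdAge as a NEGATIVE number of 100-ns intervals.
    public static double DaysUntilPasswordExpires(long pwdLastSetFileTime, long maxPwdAgeTicks, DateTime now)
    {
        double maxAgeDays = TimeSpan.FromTicks(-maxPwdAgeTicks).TotalDays;
        double passwordAgeDays = (now - DateTime.FromFileTime(pwdLastSetFileTime)).TotalDays;

        // A negative result means the password has already expired
        return maxAgeDays - passwordAgeDays;
    }
}
```

For example, a password set 50 days ago under a 42-day maximum age comes out roughly 8 days past expiry.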
Now, if none of those cases are met, we check the user account control property to get the current status of the account.

private static string GetStatusValue(uint LCID, string StatusValue, DirectoryEntry entry)
{
    string AccountControlValue = entry.Properties["userAccountControl"].Value.ToString();
 
    switch (AccountControlValue)
    {
        case "512":
            StatusValue = "Status_EnabledAccount";
            break;
        case "514":
            StatusValue = "Status_DisabledAccount";
            break;
        case "544":
            StatusValue = "Status_EnabledPasswordNotRequired";
            break;
        case "546":
            StatusValue = "Status_DisabledPasswordNotRequired";
            break;
        case "66048":
            StatusValue = "Status_EnabledPasswordDoesnotExpire";
            break;
        case "66050":
            StatusValue = "Status_DisabledPasswordDoesnotExpire";
            break;
        case "66080":
            StatusValue = "Status_EnabledPasswordDoesnotExpireNotRequired";
            break;
        case "66082":
            StatusValue = "Status_DisabledPasswordDoesnotExpireNotRequired";
            break;
        case "262656":
            StatusValue = "Status_EnabledSmartcardRequired";
            break;
        case "262658":
            StatusValue = "Status_DisabledSmartcardRequired";
            break;
        case "262688":
            StatusValue = "Status_EnabledSmartcardRequiredPasswordNotRequired";
            break;
        case "262690":
            StatusValue = "Status_DisabledSmartcardRequiredPasswordNotRequired";
            break;
        case "328192":
            StatusValue = "Status_EnabledSmartcardRequiredPasswordDoesnotExpire";
            break;
        case "328194":
            StatusValue = "Status_DisabledSmartcardRequiredPasswordDoesnotExpire";
            break;
        case "328224":
            StatusValue = "Status_EnabledSmartcardRequiredPasswordDoesnotExpireNotRequired";
            break;
        case "328226":
            StatusValue = "Status_DisabledSmartcardRequiredPasswordDoesnotExpireNotRequired";
            break;
        default:
            StatusValue = "Status_NotAvailable";
            break;
    }
    return StatusValue ;
}
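The switch above only covers specific exact values, but userAccountControl is a bit field (the documented flags include ACCOUNTDISABLE = 0x2, PASSWD_NOTREQD = 0x20, DONT_EXPIRE_PASSWORD = 0x10000, SMARTCARD_REQUIRED = 0x40000). The same status keys could be derived more robustly by testing the individual flags; a sketch that mirrors the resource-key naming used above:

```csharp
using System;

public static class AccountStatus
{
    // userAccountControl bit flags, per the Microsoft AD documentation
    const int AccountDisabled      = 0x0002;  // ACCOUNTDISABLE
    const int PasswordNotRequired  = 0x0020;  // PASSWD_NOTREQD
    const int PasswordNeverExpires = 0x10000; // DONT_EXPIRE_PASSWORD
    const int SmartcardRequired    = 0x40000; // SMARTCARD_REQUIRED

    // Rebuild the same "Status_..." resource keys from the raw flag value
    public static string GetStatusKey(int uac)
    {
        bool disabled     = (uac & AccountDisabled) != 0;
        bool notRequired  = (uac & PasswordNotRequired) != 0;
        bool neverExpires = (uac & PasswordNeverExpires) != 0;
        bool smartcard    = (uac & SmartcardRequired) != 0;

        string key = "Status_" + (disabled ? "Disabled" : "Enabled");

        if (smartcard)
            key += "SmartcardRequired";
        if (neverExpires)
            key += "PasswordDoesnotExpire";
        if (notRequired)
            key += neverExpires ? "NotRequired" : "PasswordNotRequired";
        if (!smartcard && !neverExpires && !notRequired)
            key += "Account";

        return key;
    }
}
```

This handles combinations the exact-match switch would fall through to "Status_NotAvailable", e.g. a disabled account that is also locked out.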
 
 
Hope you find this post useful :)

Wednesday, January 30, 2013

Creating external system (BCS) search content sources and binding crawled properties to managed metadata properties programmatically

Business Connectivity Services (BCS) is a set of services and features that connect SharePoint-based solutions to sources of external data. It is included in SharePoint Foundation 2010, SharePoint Server 2010, and Office 2010 applications.

The Search Service Application (SSA) allows content to be crawled and indexed, and then allows users to get results through search queries.

OK, what about combining those two nice features: the power of BCS to get data from external data sources, and the power of the SSA to crawl that data, index it, and query it in no time?

Searching your content using the SSA can be configured from Central Administration like a piece of cake, according to these MSDN articles (Part 1 & Part 2) and this nice article.

In this article I assume that you have gone through all this hassle before. But what about creating this whole structure in just one click, on feature activation for example?

What about a site-scoped feature that creates a business data search content source, then creates a search scope associated with this content source, performs a full crawl to get the crawled external fields, and then connects those fields to managed properties so they are easy to use in advanced search queries?

OK, let's start.

I assume that you have created an external content type with a LOB system named "LOBSystemName" and a LOB system instance named "LOBSystemInstanceName".

To create the business data content source you need to do the following:


using (SPSite site = new SPSite(SiteURL))
{
    SearchContext context = SearchContext.GetContext(site);

    Content BSCContent = new Content(context);

    ContentSourceCollection BSCContentSourceCollection = BSCContent.ContentSources;
    string NewContentSource = "New Content Source Title";

    if (BSCContentSourceCollection.Exists(NewContentSource))
    {
        Console.WriteLine("Content source already exists");

        return false;
    }
    else
    {
        try
        {
            BusinessDataContentSource BSCContentSource = (BusinessDataContentSource)BSCContentSourceCollection.Create(typeof(BusinessDataContentSource), NewContentSource);

            BSCContentSource.StartAddresses.Add(BusinessDataContentSource.ConstructStartAddress("Default", new Guid("00000000-0000-0000-0000-000000000000"), "LOBSystemName", "LOBSystemInstanceName"));

            BSCContentSource.StartFullCrawl();

            return true;
        }
        catch (Exception ex)
        {
            Console.WriteLine("Failed to create content source");
            Console.WriteLine(ex.Message);

            throw new Exception("Failed to create content source \n" + ex.Message);
        }
    }
}

Now that the content source is created, you will need to create a search scope associated with it; the following method does this task:


/// <summary>
/// Create a new content-source-based search scope
/// </summary>
/// <param name="site">The newly created site</param>
/// <param name="context">The search context to create the search scope within</param>
/// <param name="ContentSourceName">The search content source name to associate with the new search scope</param>
static private void CreateBCSSearchScope(SPSite site, SearchContext context, string ContentSourceName)
{
    string scopeName = ContentSourceName;
    string displayGroupName = "GTS Scopes";

    // The RemoteScopes class retrieves information via the search web service,
    // so we run this as the search service account
    RemoteScopes remoteScopes = new RemoteScopes(SPServiceContext.GetContext(site));

    // See if there is an existing scope
    Scope scope = (from s in remoteScopes.GetScopesForSite(new Uri(site.Url)).Cast<Scope>()
                   where s.Name == scopeName
                   select s).FirstOrDefault();

    // Only add the scope if it doesn't exist already
    if (scope == null)
    {
        Schema sspSchema = new Schema(context);
        ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
        scope = remoteScopes.AllScopes.Create(scopeName, "Search Scope for " + scopeName, null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
        scope.Rules.CreatePropertyQueryRule(ScopeRuleFilterBehavior.Include, properties["ContentSource"], ContentSourceName);
    }

    // See if there is an existing display group
    ScopeDisplayGroup displayGroup = (from d in remoteScopes.GetDisplayGroupsForSite(new Uri(site.Url)).Cast<ScopeDisplayGroup>()
                                      where d.Name == displayGroupName
                                      select d).FirstOrDefault();

    // Add the display group if it doesn't exist
    if (displayGroup == null)
        displayGroup = remoteScopes.AllDisplayGroups.Create(displayGroupName, "", new Uri(site.Url), true);

    // Add the scope to the display group if not already added
    if (!displayGroup.Contains(scope))
    {
        displayGroup.Add(scope);
        displayGroup.Default = scope;
        displayGroup.Update();
    }

    // Optionally force a scope compilation so this is available immediately
    remoteScopes.StartCompilation();
}
The site parameter is the site you are creating this whole topology for, and the search context is the one already defined in the previous method.

OK, recap: content source created DONE, search scope created and associated DONE. Now we need to perform a full crawl to get the crawled properties (database table fields in our case) so that we can map them to managed metadata properties.

To do this step we need a loop that starts the crawl and waits till it is done, then calls the method that creates the managed properties. The following code snippet is the key:


                        BSCContentSource.StartFullCrawl();
 
                        Console.WriteLine("Crawling will start in 10 seconds");
                        Thread.Sleep(10 * 1000);
                        Console.WriteLine("Crawling started");
 
                        do
                        {
                            Thread.Sleep(10 * 1000);
                            Console.WriteLine("Waiting for the content source to finish crawling...");
                        } while (BSCContentSource.CrawlStatus != CrawlStatus.Idle);
 
                        Console.WriteLine("Crawling has been done successfully!");
                        PMFLogger.Instance.LogInformation("Crawling has been done successfully!");
 
                        //Start creating/mapping the new metadata properties
                        CreateBCSMetadataProperties(context, ModelName);
Now the data is crawled and indexed. OK, let's create the managed properties and map the crawled fields to them. In my case I had the following columns in the SQL database (the external data source):

ID, Title, StartDate, EndDate, Entity, Type, OrganizationUnit, Owner, Alias, and StrategyPlan.


        /// <summary>
        /// Creates new Business Search Metadata Properties to be used in the search
        /// </summary>
        /// <param name="context">The Search context to create the Business Search Metadata Properties within</param>
        /// <param name="ModelName">The newly created BCS model name to acquire and map columns from</param>
        static private void CreateBCSMetadataProperties(SearchContext context,string ModelName)
        {
            Schema sspSchema = new Schema(context);
            ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
 
            //Create the managed properties if they do not exist; otherwise get the already created ones for the mappings
            ManagedProperty ectID;
            ManagedProperty ectTitle;
            ManagedProperty ectStartDate;
            ManagedProperty ectEndDate;
            ManagedProperty ectEntity;
            ManagedProperty ectType;
            ManagedProperty ectOrganizationUnit;
            ManagedProperty ectOwner;
            ManagedProperty ectAlias;
            ManagedProperty ectStrategyPlan;
            
 
            if (!properties.Contains("ectID"))
            {
                ectID = properties.Create("ectID", ManagedDataType.Text);
                ectID.EnabledForScoping = true;
            }
            else
                ectID = properties["ectID"];
 
            if (!properties.Contains("ectTitle"))
            {
                ectTitle = properties.Create("ectTitle", ManagedDataType.Text);
                ectTitle.EnabledForScoping = true;
            }
            else
                ectTitle = properties["ectTitle"];
 
            if (!properties.Contains("ectStartDate"))
            {
                ectStartDate = properties.Create("ectStartDate", ManagedDataType.DateTime);
                ectStartDate.EnabledForScoping = true;
                ectStartDate.HasMultipleValues = false;
            }
            else
                ectStartDate = properties["ectStartDate"];
 
            if (!properties.Contains("ectEndDate"))
            {
                ectEndDate = properties.Create("ectEndDate", ManagedDataType.DateTime);
                ectEndDate.EnabledForScoping = true;
                ectEndDate.HasMultipleValues = false;
            }
            else
                ectEndDate = properties["ectEndDate"];
            
            if (!properties.Contains("ectEntity"))
            {
                ectEntity = properties.Create("ectEntity", ManagedDataType.Text);
                ectEntity.EnabledForScoping = true;
                ectEntity.HasMultipleValues = false;
            }
            else
                ectEntity = properties["ectEntity"];
 
            if (!properties.Contains("ectType"))
            {
                ectType = properties.Create("ectType", ManagedDataType.Text);
                ectType.EnabledForScoping = true;
                ectType.HasMultipleValues = false;
            }
            else
                ectType = properties["ectType"];
 
            if (!properties.Contains("ectOrganizationUnit"))
            {
                ectOrganizationUnit = properties.Create("ectOrganizationUnit", ManagedDataType.Text);
                ectOrganizationUnit.EnabledForScoping = true;
                ectOrganizationUnit.HasMultipleValues = false;
            }
            else
                ectOrganizationUnit = properties["ectOrganizationUnit"];
 
            if (!properties.Contains("ectOwner"))
            {
                ectOwner = properties.Create("ectOwner", ManagedDataType.Text);
                ectOwner.EnabledForScoping = true;
                ectOwner.HasMultipleValues = false;
            }
            else
                ectOwner = properties["ectOwner"];
 
            if (!properties.Contains("ectAlias"))
            {
                ectAlias = properties.Create("ectAlias", ManagedDataType.Text);
                ectAlias.EnabledForScoping = true;
                ectAlias.HasMultipleValues = false;
                ectAlias.MaxCharactersInPropertyStoreIndex = 450;
            }
            else
                ectAlias = properties["ectAlias"];
 
 
            if (!properties.Contains("ectStrategyPlan"))
            {
                ectStrategyPlan = properties.Create("ectStrategyPlan", ManagedDataType.Text);
                ectStrategyPlan.EnabledForScoping = true;
                ectStrategyPlan.HasMultipleValues = false;
            }
            else
                ectStrategyPlan = properties["ectStrategyPlan"];
            
 
            //Map the Query Crawled Properties to the Managed Property
            MaptoManagedProperty(context, ectID, ModelName + " Items.ID", ManagedDataType.Text);
            MaptoManagedProperty(context, ectTitle, ModelName + " Items.Name", ManagedDataType.Text);
            MaptoManagedProperty(context, ectStartDate, ModelName + " Items.StartDate", ManagedDataType.DateTime);
            MaptoManagedProperty(context, ectEndDate, ModelName + " Items.EndDate", ManagedDataType.DateTime);
            MaptoManagedProperty(context, ectEntity, ModelName + " Items.Entity", ManagedDataType.Text);
            MaptoManagedProperty(context, ectType, ModelName + " Items.Type", ManagedDataType.Text);
            MaptoManagedProperty(context, ectOrganizationUnit, ModelName + " Items.OrganizationUnit", ManagedDataType.Text);
            MaptoManagedProperty(context, ectOwner, ModelName + " Items.Owner", ManagedDataType.Text);
            MaptoManagedProperty(context, ectAlias, ModelName + " Items.Alias", ManagedDataType.Text);
            MaptoManagedProperty(context, ectStrategyPlan, ModelName + " Items.StrategyPlan", ManagedDataType.Text);
        }
 
        /// <summary>
        /// Maps the external content type columns to a specific Managed Property
        /// </summary>
        /// <param name="context">The Search context to map the the external content type columns within</param>
        /// <param name="managedProperty">The managed property to map the column to</param>
        /// <param name="crawledPropertyName">The crawled property "Column" Name</param>
        /// <param name="DataType"></param>
        private static void MaptoManagedProperty(SearchContext context, ManagedProperty managedProperty, string crawledPropertyName, ManagedDataType DataType)
        {
            SPSecurity.RunWithElevatedPrivileges(() =>
            {
                Schema schema = new Schema(context);
 
                try
                {
                    Category category = schema.AllCategories["Business Data"];
                    var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
                    var crawledProp = crawledProps.FirstOrDefault();
                    if (crawledProp != null)
                    {
                        var mappings = managedProperty.GetMappings();
                        mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
                        managedProperty.SetMappings(mappings);
                        managedProperty.Update();
                    }
                    else
                    {
                        Console.WriteLine("Query Crawled Property " + crawledPropertyName + " was not found - Mapping faild.");
                    }
                }
                catch (Exception ex)
                {
                    throw new Exception("Faild to map field to Crawled Property \n" + ex.Message);
                }
            });
 
        }
OK, a few things in the code above need some explanation:
The "ModelName" variable holds the external content type model name.

Why is "EnabledForScoping" set to true? That one is easy: it allows the new managed properties to be exposed to search scopes.

Why is "HasMultipleValues" set to false on some properties? If a managed property is a DateTime or an int, the data returned in the search results will be System.DateTime[] instead of the actual crawled value; setting this property to false solves the issue :)

Why is "MaxCharactersInPropertyStoreIndex" set to 450? This reduces storage requirements for text properties by using a hash for comparison, and it also makes it possible to order search results by this metadata property.

The whole code in one block .... Have a nice day :)



using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
 
using Microsoft.SharePoint;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Administration;
using Microsoft.Office.Server;
using Microsoft.Office.Server.Search.Administration;
using System.Threading;
 
 
namespace BCSModelGeneration
{      
    public class ContentSourceGenerator
    {
        /// <summary>
        /// Creates a Business Connectivity Services LOB SharePoint content source
        /// </summary>
        /// <param name="InitialCatalog">Rdb Connection tenant Initial Catalog</param>
        /// <param name="SiteURL">The new created site - Entity</param>
        /// <param name="ModelName">The new created Model Name </param>
        /// <returns>The status of the LOB SharePoint Content Source creation</returns>
        internal static bool CreateBCSContentSource(string InitialCatalog, string SiteURL, string ModelName)
        {
            using (SPSite site = new SPSite(SiteURL))
            {
                SearchContext context = SearchContext.GetContext(site);
 
                Content BSCContent = new Content(context);
 
                ContentSourceCollection BSCContentSourceCollection = BSCContent.ContentSources;
                string NewContentSource = InitialCatalog;
 
                if (BSCContentSourceCollection.Exists(NewContentSource))
                {
                    Console.WriteLine("Content Source Already Exsist");
                    return false;
                }
                else
                {
                    try
                    {
                        BusinessDataContentSource BSCContentSource = (BusinessDataContentSource)BSCContentSourceCollection.Create(typeof(BusinessDataContentSource), NewContentSource);
 
                        BSCContentSource.StartAddresses.Add(BusinessDataContentSource.ConstructStartAddress("Default", new Guid("00000000-0000-0000-0000-000000000000"), InitialCatalog, InitialCatalog));
 
                        WeeklySchedule Weekly = CreatWeeklySchedule(context, 2);
 
                        BSCContentSource.FullCrawlSchedule = Weekly;
 
                        DailySchedule Daily = CreateDailySchedule(context, 2);
                        BSCContentSource.IncrementalCrawlSchedule = Daily;
 
                        BSCContentSource.Update();
 
                        BSCContentSource.StartFullCrawl();
 
                        Console.WriteLine("Carwling wil start in 10 secounds");
                        Thread.Sleep(10 * 1000);
                        Console.WriteLine("Carwling Started");
                        //Start creating the new Search Scope
                        CreateBCSSearchScope(site, context, BSCContentSource.Name);
 
                        do
                        {
                            Thread.Sleep(10 * 1000);
                            Console.WriteLine("Waiting the content source to finish crawling..");
                        } while (BSCContentSource.CrawlStatus != CrawlStatus.Idle);
 
                        Console.WriteLine("Crawling has been done successfully !");
 
                        //Start creating/mapping the new Metadata Properties
                        CreateBCSMetadataProperties(context, ModelName);
 
                        Console.WriteLine("Content Source Created");
 
                        Console.WriteLine("Starting a new Crawling process to the content source to fill the new mapped Metadata Properties");
 
                        BSCContentSource.StartFullCrawl();
 
                        return true;
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine("Faild to ceate content source");
                        Console.WriteLine(ex.Message);
                        throw new Exception("Faild to ceate content source \n" + ex.Message);
                    }
                }
 
            }
        }
 
        /// <summary>
        /// Creates a weekly schedule for the search content source
        /// </summary>
        /// <param name="context">The Search context to create the schedule within</param>
        /// <param name="WeeksInterval">Indicates that the content should be crawled every "WeeksInterval" number of weeks</param>
        /// <returns></returns>
        private static WeeklySchedule CreatWeeklySchedule(SearchContext context,int WeeksInterval)
        {
            WeeklySchedule Weekly = new WeeklySchedule(context);
 
            Weekly.BeginDay = DateTime.Now.Day;
            Weekly.BeginMonth = DateTime.Now.Month;
            Weekly.BeginYear = DateTime.Now.Year;
            //Starts at 1:00 AM
            Weekly.StartHour = 1;
            Weekly.StartMinute = 00;
            //Indicates that the content should be crawled every WeeksInterval weeks.
            Weekly.WeeksInterval = WeeksInterval;
            return Weekly;
        }
 
        /// <summary>
        /// Creates a daily schedule for the search content source
        /// </summary>
        /// <param name="context">The Search context to create the schedule within</param>
        /// <param name="DaysInterval">Indicates that the content should be crawled every "DaysInterval" number of days.</param>
        /// <returns></returns>
        private static DailySchedule CreateDailySchedule(SearchContext context,int DaysInterval)
        {
            DailySchedule Daily = new DailySchedule(context);
            Daily.BeginDay = DateTime.Now.Day;
            Daily.BeginMonth = DateTime.Now.Month;
            Daily.BeginYear = DateTime.Now.Year;
            //Starts at 1:00 AM
            Daily.StartHour = 1;
            Daily.StartMinute = 00;
            //Indicates that the content should be crawled every DaysInterval days.
            Daily.DaysInterval = DaysInterval;
            //To make the daily schedule repeat every hour instead, uncomment:
            //Daily.RepeatInterval = 60;
            //Daily.RepeatDuration = 1440;
            return Daily;
        }
 
        /// <summary>
        /// Create new Content Source type Search Scope
        /// </summary>
        /// <param name="site">The new created site - Entity</param>
        /// <param name="context">The Search context to create the Search Scope within</param>
        /// <param name="ContentSourceName">The Search content source name to be associated to the new Search Scope</param>
        static private void CreateBCSSearchScope(SPSite site, SearchContext context, string ContentSourceName)
        {
            string scopeName = ContentSourceName;
            string displayGroupName = "GTS Scopes";

            // The RemoteScopes class retrieves information via the search web service,
            // so we run this as the search service account
            RemoteScopes remoteScopes = new RemoteScopes(SPServiceContext.GetContext(site));

            // See if there is an existing scope
            Scope scope = (from s in remoteScopes.GetScopesForSite(new Uri(site.Url)).Cast<Scope>()
                           where s.Name == scopeName
                           select s).FirstOrDefault();

            // Only add the scope if it doesn't exist already
            if (scope == null)
            {
                Schema sspSchema = new Schema(context);
                ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
                scope = remoteScopes.AllScopes.Create(scopeName, "Search Scope for " + scopeName, null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
                scope.Rules.CreatePropertyQueryRule(ScopeRuleFilterBehavior.Include, properties["ContentSource"], ContentSourceName);
            }

            // See if there is an existing display group
            ScopeDisplayGroup displayGroup = (from d in remoteScopes.GetDisplayGroupsForSite(new Uri(site.Url)).Cast<ScopeDisplayGroup>()
                                              where d.Name == displayGroupName
                                              select d).FirstOrDefault();

            // Add the display group if it doesn't exist
            if (displayGroup == null)
                displayGroup = remoteScopes.AllDisplayGroups.Create(displayGroupName, "", new Uri(site.Url), true);

            // Add the scope to the display group if not already added
            if (!displayGroup.Contains(scope))
            {
                displayGroup.Add(scope);
                displayGroup.Default = scope;
                displayGroup.Update();
            }

            // Optionally force a scope compilation so the scope is available immediately
            remoteScopes.StartCompilation();
        }
 
        /// <summary>
        /// Creates new Business Search Metadata Properties to be used in the search
        /// </summary>
        /// <param name="context">The Search context to create the Business Search Metadata Properties within</param>
        /// <param name="ModelName">The newly created BCS Model name to aquire and map columns from</param>
        static private void CreateBCSMetadataProperties(SearchContext context,string ModelName)
        {
            Schema sspSchema = new Schema(context);
            ManagedPropertyCollection properties = sspSchema.AllManagedProperties;
 
            //TODO: ADD THE NEW FIELDS ADDED TO THE VIEW AFTER FINALIZATION
            //Create the managed properties if they do not exist; otherwise get the existing ones for the mappings
            ManagedProperty ectID;
            ManagedProperty ectTitle;
            ManagedProperty ectStartDate;
            ManagedProperty ectEndDate;
            ManagedProperty ectEntity;
            ManagedProperty ectType;
            ManagedProperty ectOrganizationUnit;
            ManagedProperty ectOwner;
            ManagedProperty ectAlias;
            ManagedProperty ectStrategyPlan;
            
 
            if (!properties.Contains("ectID"))
            {
                ectID = properties.Create("ectID", ManagedDataType.Text);
                ectID.EnabledForScoping = true;
            }
            else
                ectID = properties["ectID"];
 
            if (!properties.Contains("ectTitle"))
            {
                ectTitle = properties.Create("ectTitle", ManagedDataType.Text);
                ectTitle.EnabledForScoping = true;
            }
            else
                ectTitle = properties["ectTitle"];
 
            if (!properties.Contains("ectStartDate"))
            {
                ectStartDate = properties.Create("ectStartDate", ManagedDataType.DateTime);
                ectStartDate.EnabledForScoping = true;
                ectStartDate.HasMultipleValues = false;
            }
            else
                ectStartDate = properties["ectStartDate"];
 
            if (!properties.Contains("ectEndDate"))
            {
                ectEndDate = properties.Create("ectEndDate", ManagedDataType.DateTime);
                ectEndDate.EnabledForScoping = true;
                ectEndDate.HasMultipleValues = false;
            }
            else
                ectEndDate = properties["ectEndDate"];
            
            if (!properties.Contains("ectEntity"))
            {
                ectEntity = properties.Create("ectEntity", ManagedDataType.Text);
                ectEntity.EnabledForScoping = true;
                ectEntity.HasMultipleValues = false;
            }
            else
                ectEntity = properties["ectEntity"];
 
            if (!properties.Contains("ectType"))
            {
                ectType = properties.Create("ectType", ManagedDataType.Text);
                ectType.EnabledForScoping = true;
                ectType.HasMultipleValues = false;
            }
            else
                ectType = properties["ectType"];
 
            if (!properties.Contains("ectOrganizationUnit"))
            {
                ectOrganizationUnit = properties.Create("ectOrganizationUnit", ManagedDataType.Text);
                ectOrganizationUnit.EnabledForScoping = true;
                ectOrganizationUnit.HasMultipleValues = false;
            }
            else
                ectOrganizationUnit = properties["ectOrganizationUnit"];
 
            if (!properties.Contains("ectOwner"))
            {
                ectOwner = properties.Create("ectOwner", ManagedDataType.Text);
                ectOwner.EnabledForScoping = true;
                ectOwner.HasMultipleValues = false;
            }
            else
                ectOwner = properties["ectOwner"];
 
            if (!properties.Contains("ectAlias"))
            {
                ectAlias = properties.Create("ectAlias", ManagedDataType.Text);
                ectAlias.EnabledForScoping = true;
                ectAlias.HasMultipleValues = false;
                ectAlias.MaxCharactersInPropertyStoreIndex = 450;
            }
            else
                ectAlias = properties["ectAlias"];
 
 
            if (!properties.Contains("ectStrategyPlan"))
            {
                ectStrategyPlan = properties.Create("ectStrategyPlan", ManagedDataType.Text);
                ectStrategyPlan.EnabledForScoping = true;
                ectStrategyPlan.HasMultipleValues = false;
            }
            else
                ectStrategyPlan = properties["ectStrategyPlan"];
            
 
            //TODO: ADD THE NEW FIELDS ADDED TO THE VIEW AFTER FINALIZATION
            //Map the Query Crawled Properties to the Managed Property
            MaptoManagedProperty(context, ectID, ModelName + " Items.ID", ManagedDataType.Text);
            MaptoManagedProperty(context, ectTitle, ModelName + " Items.Name", ManagedDataType.Text);
            MaptoManagedProperty(context, ectStartDate, ModelName + " Items.StartDate", ManagedDataType.DateTime);
            MaptoManagedProperty(context, ectEndDate, ModelName + " Items.EndDate", ManagedDataType.DateTime);
            MaptoManagedProperty(context, ectEntity, ModelName + " Items.Entity", ManagedDataType.Text);
            MaptoManagedProperty(context, ectType, ModelName + " Items.Type", ManagedDataType.Text);
            MaptoManagedProperty(context, ectOrganizationUnit, ModelName + " Items.OrganizationUnit", ManagedDataType.Text);
            MaptoManagedProperty(context, ectOwner, ModelName + " Items.Owner", ManagedDataType.Text);
            MaptoManagedProperty(context, ectAlias, ModelName + " Items.Alias", ManagedDataType.Text);
            MaptoManagedProperty(context, ectStrategyPlan, ModelName + " Items.StrategyPlan", ManagedDataType.Text);
        }
 
        /// <summary>
        /// Maps the external content type columns to a specific Managed Property
        /// </summary>
        /// <param name="context">The Search context to map the the external content type columns within</param>
        /// <param name="managedProperty">The managed property to map the column to</param>
        /// <param name="crawledPropertyName">The crawled property "Column" Name</param>
        /// <param name="DataType"></param>
        private static void MaptoManagedProperty(SearchContext context, ManagedProperty managedProperty, string crawledPropertyName, ManagedDataType DataType)
        {
            SPSecurity.RunWithElevatedPrivileges(() =>
            {
                Schema schema = new Schema(context);
 
                try
                {
                    Category category = schema.AllCategories["Business Data"];
                    var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
                    var crawledProp = crawledProps.FirstOrDefault();
                    if (crawledProp != null)
                    {
                        var mappings = managedProperty.GetMappings();
                        mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
                        managedProperty.SetMappings(mappings);
                        managedProperty.Update();
                    }
                    else
                    {
                        Console.WriteLine("Query Crawled Property " + crawledPropertyName + " was not found - Mapping faild.");
                    }
                }
                catch (Exception ex)
                {
                    throw new Exception("Faild to map field to Crawled Property \n" + ex.Message);
                }
            });
 
        }
    }
}