Wednesday 26 April 2017

“Access-Control-Allow-Origin” – A saviour for Cross Domain calls if used wisely!

In SharePoint 2013, we were recently integrating the new SignalR (+ OWIN) version 2.2.1.0 to meet one of our customer's requirements. During the implementation, the need arose for a cross-domain connection.

In this article, I won't go through the details of the SignalR implementation, as that would diverge from the current topic. You may find some nice articles here which explain how to use SignalR and how to use CORS in SignalR. I am sure that if you follow the steps in those articles diligently, you won't face any problems implementing this.

The Problem:

In our case, we got the below error in IE when we tried to establish a cross-domain hub connection from the client side using SignalR CORS:

SEC7128: Multiple Access-Control-Allow-Origin headers are not allowed for CORS response.


Before going further, let's assume we have 3 different intranet web applications in a SharePoint 2013 environment, and the SignalR hub is configured on all 3 of them.

Web application 1: http://intra.abc.com
Web application 2: http://intra.pqr.com
Web application 3: http://intra.xyz.com


While investigating the above error, we found that there was already an entry for the "Access-Control-Allow-Origin" custom header in the web.config of the 3rd web application, and it looked like this:

<httpProtocol>
  <customHeaders>
    <add name="Access-Control-Allow-Origin" value="http://intra.abc.com" />
    <add name="Access-Control-Request-Method" value="POST,GET,OPTIONS,PUT,DELETE" />
    <add name="Access-Control-Request-Headers" value="Content-Type, Authorization, Accept" />
    <add name="Access-Control-Allow-Credentials" value="true" />
  </customHeaders>
</httpProtocol>


And hence the SignalR code (placed on the 2nd application) was failing from the client side when it tried to establish a cross-domain hub connection with the 3rd application.


Why an error?

The SignalR CORS code adds the required "Access-Control-Allow-Origin" header and value to the response headers, and this was creating duplicate entries in the response (one from web.config and one from the SignalR CORS code), hence the error. Keep this in mind: there must be only one "Access-Control-Allow-Origin" entry in the response headers.
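To illustrate, the failing response (as seen in the browser's network trace) carried roughly the following pair of headers, which is exactly what IE refuses; the second value is the caller's origin echoed back by the SignalR CORS code:

Access-Control-Allow-Origin: http://intra.abc.com   (from web.config)
Access-Control-Allow-Origin: http://intra.pqr.com   (from the SignalR CORS code)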


Investigation

This is the statement in the SignalR startup class that adds the allow-origin entry to the response header:

map.UseCors(CorsOptions.AllowAll);
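For context, this statement typically sits inside an OWIN Startup class roughly like the one below (a minimal sketch of the standard SignalR 2.x setup; the namespace, route and hub configuration are illustrative):

Startup.cs:

using Microsoft.AspNet.SignalR;
using Microsoft.Owin;
using Microsoft.Owin.Cors;
using Owin;

[assembly: OwinStartup(typeof(SignalRDemo.Startup))]

namespace SignalRDemo
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            app.Map("/signalr", map =>
            {
                // This call writes "Access-Control-Allow-Origin" (echoing the
                // caller's origin) into the response headers for SignalR requests.
                map.UseCors(CorsOptions.AllowAll);
                map.RunSignalR(new HubConfiguration());
            });
        }
    }
}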

Some blogs on the internet suggest commenting out the map.UseCors statement if web.config already has the "Access-Control-Allow-Origin" header. But in our case, the web application address we needed in web.config was the 2nd application's, whereas the config already held the 1st application's.

During troubleshooting, we found the real limitation of the "Access-Control-Allow-Origin" header: it accepts ONLY ONE origin value, and it does not accept multiple domain addresses at all, whether space-separated, comma-separated, or "*".

These are the 3 cases which DO NOT work in practice with "Access-Control-Allow-Origin", no matter what the W3C or Microsoft standards say (the values below use the example domains from this post):

  • Multiple space-separated origins, e.g. <add name="Access-Control-Allow-Origin" value="http://intra.abc.com http://intra.pqr.com" />
  • Multiple comma-separated origins, e.g. <add name="Access-Control-Allow-Origin" value="http://intra.abc.com,http://intra.pqr.com" />
  • The wildcard: <add name="Access-Control-Allow-Origin" value="*" />

Some posts on the internet say "*" works as long as "Access-Control-Allow-Credentials" is kept as "false". We tried that as well, and it did not work for us.

So the only working option remaining with "Access-Control-Allow-Origin" was to use a single domain address as the origin.

In our case, calls/connections were being made to the 3rd application from two places, which means we needed two domain addresses (the 2nd application for SignalR and the 1st application for another, older feature) registered in the allow-origin header, but this is not possible from the web.config of the 3rd application. If we kept just one address, it would break the other application's feature.

Solution:

To overcome this, after investigation and googling, we came up with a solution that proved helpful for both SignalR and the old feature.

Step 1: We decided to read the ORIGIN address dynamically instead of hardcoding it in the web.config of the 3rd application. So we moved the logic to the Global.asax file (in the 3rd application) and removed the entries from web.config. Optionally, incoming requests can be filtered in the "Application_BeginRequest" event to allow only the 1st and 2nd applications (see the allowlist sketch after the Global.asax listing below).

Global.asax:

<%@ Assembly Name="Microsoft.SharePoint"%>
<%@ Application Language="C#" Inherits="Microsoft.SharePoint.ApplicationRuntime.SPHttpApplication" %>

<script runat="server">
    void Application_BeginRequest(object sender, EventArgs e)
    {
        // Same-origin or direct requests carry no referrer, so skip them
        // to avoid a NullReferenceException.
        if (Request.UrlReferrer == null)
            return;

        // Echo the caller's origin back instead of a hardcoded value.
        Response.Headers.Remove("Access-Control-Allow-Origin");
        Response.AddHeader("Access-Control-Allow-Origin", Request.UrlReferrer.GetLeftPart(UriPartial.Authority));

        Response.Headers.Remove("Access-Control-Allow-Credentials");
        Response.AddHeader("Access-Control-Allow-Credentials", "true");

        Response.Headers.Remove("Access-Control-Allow-Methods");
        Response.AddHeader("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS");
    }
</script>
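If you also want the optional filtering mentioned in Step 1, a minimal allowlist sketch could look like the following; it replaces the simpler handler above inside the same <script runat="server"> block (the allowed origins are the example intranet addresses from this post; adjust them to your own):

Global.asax (Application_BeginRequest with an allowlist):

    // Allowed callers (example addresses from this post).
    private static readonly string[] AllowedOrigins =
    {
        "http://intra.abc.com",   // 1st application (old feature)
        "http://intra.pqr.com"    // 2nd application (SignalR)
    };

    void Application_BeginRequest(object sender, EventArgs e)
    {
        // Same-origin or direct requests carry no referrer; nothing to add.
        if (Request.UrlReferrer == null)
            return;

        string origin = Request.UrlReferrer.GetLeftPart(UriPartial.Authority);

        // Only echo the origin back for known callers.
        bool allowed = false;
        foreach (string allowedOrigin in AllowedOrigins)
        {
            if (string.Equals(allowedOrigin, origin, StringComparison.OrdinalIgnoreCase))
            {
                allowed = true;
                break;
            }
        }
        if (!allowed)
            return;

        Response.Headers.Remove("Access-Control-Allow-Origin");
        Response.AddHeader("Access-Control-Allow-Origin", origin);

        Response.Headers.Remove("Access-Control-Allow-Credentials");
        Response.AddHeader("Access-Control-Allow-Credentials", "true");

        Response.Headers.Remove("Access-Control-Allow-Methods");
        Response.AddHeader("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS");
    }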


Step 2: After this, we commented out the statement below in the SignalR startup code, which was adding the allow-origin header. SignalR now no longer adds any allow-origin header of its own.

// map.UseCors(CorsOptions.AllowAll);

This solution has been working like a charm so far for every cross-domain request coming towards the 3rd application (whether from the 2nd application for SignalR or from the 1st application for the old feature), and the features on both the 1st and 2nd applications work perfectly.


Happy Learning! :)

Tuesday 25 April 2017

PostTitle lookup field is blank in blog site Comments list

In SharePoint 2013 on-premises, we recently faced a strange problem with the blog site Comments list. The hidden lookup field "PostTitle" in the Comments list was not storing the respective Post title reference after a comment was added against that Post, and hence the comments were not visible below such Posts.


When we googled this issue, the most commonly suggested solution was "recycle the app pool".

We tried many different solutions to resolve it, but none of them helped. To name a few:

1. Checked app pool recycling, but it was already scheduled to run every morning.
2. Tried clearing the distributed cache.
3. Tried adding the PostTitle field explicitly to the Comments list content type (normally the "PostTitle" field is hidden).
4. Tried repairing the "PostTitle" lookup column so it connects to the Comments list again, as described in this thread: http://www.ilikesharepoint.de/2013/12/sharepoint-repair-lookup-columns-which-are-not-connected-to-a-list/

After banging our heads against the issue for a long time, one of our team members luckily observed a strange behavior during troubleshooting.

The issue was with the query string parameter ("id") in the post.aspx URL when opening a particular post: the parameter name was in lower case.


When we tried it in upper case, i.e. "ID", the issue with the comments disappeared.




The actual culprit was the managed property "SitePath", which was forming and returning the Post URL with "id" in lower case inside a display template (for the "Content By Search" web part).

We replaced the lower-case "id" with the upper-case "ID" in the display template and uploaded it to the gallery. The results in the CBS web part then showed the Post URL with the upper-case "ID".

All comments added after redirection from this new Post URL stored the "PostTitle" field correctly in the Comments list, and hence those comments (for which the "PostTitle" field was not empty) were visible correctly below the particular Post. :)

We were not able to trace the root cause of this behavior, but it looks like SharePoint internally (while forming the query to fetch the PostTitle reference) considers the query string valid only if the parameter appears as "ID" in the Post URL.

I hope this saves somebody's day, week, or even month!

Happy Learning! :)

Showing Blog Post’s Comments and Likes Count in Content Search webpart

In SharePoint 2013 on-premises, we recently had a requirement to show the Comments and Likes counts with each blog post in a Content Search web part. We have two different SharePoint web applications (different domains with different AD), and we were required to show this web part on the front page of both applications. Another important thing to note is that there is only one "My Site" for both applications.

Based on the customer requirement, we first implemented this only for the first web application.

During this implementation, we found that the Likes count was pretty straightforward (using the "LikesCount" managed property, mapped to the "ows_LikesCount" crawled property, in the display template). A continuous or incremental crawl updates this property correctly (with a continuous crawl it takes around a 3-minute interval to update).

But this wasn't the case with the Comments crawled property "ows_NumComments". Though we created a managed property for it, say NumComments, the comments counter still wasn't updated after a continuous or incremental crawl; it requires a full crawl every time to update. We also found this issue mentioned here.

The comments counter (# Comments) column is not created as a site column; it is local to the Posts list.

To cover it with a full crawl, we can put "My Site" in a new content source and schedule a full crawl to run every 3 (or "x") minutes.

But setting up a full crawl just to update one field wasn't our choice at the time (considering the performance overhead for the whole application), so we decided to fetch the comments count with REST calls from the display template. For this to work cross-site ("My Site" was in a different domain), we had to add a few entries to the web.config of "My Site" (under configuration/system.webServer/httpProtocol/customHeaders) to allow cross-domain requests from the single calling domain:
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="Access-Control-Allow-Origin" value="<http://calling-domain>" />
        <add name="Access-Control-Request-Method" value="GET, POST, HEAD, OPTIONS" />
        <add name="Access-Control-Request-Headers" value="Content-Type, Authorization" />
        <add name="Access-Control-Allow-Credentials" value="true" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>

This worked great with the single domain for a few months. But then came a new request from the customer to add the same Content Search web part to the other application's front page as well. For the cross-site REST calls to work with 2 domains (or host headers), we simply replaced the "Access-Control-Allow-Origin" value with "*", as mentioned in this MSDN blog:

<add name="Access-Control-Allow-Origin" value="*" />

But after many unsuccessful attempts, we concluded that "*" doesn't work in a SharePoint environment; it allows only a single domain address for cross-site access.

So we decided to reconsider our first approach, using a managed property for the comments count, for this second application only. In this second application, blog posts come only from "My Site", so there was no need to do a full crawl of the whole second application; it was needed only for "My Site". That made a full crawl worth considering, as we had to run it only against "My Site" and not against the main sites with huge content and libraries.

We created a new managed property "NumComments" for the "ows_NumComments" crawled property and then created a new content source with a full crawl scheduled to run every 'x' minutes. We assigned only the "My Site" reference to the new content source and scheduled it to run every 30 minutes. Earlier we had thought this would be a performance overhead, but since the full crawl covers only "My Site", it shouldn't be a problem.

We kept the REST call approach for the first application as-is, since its posts come from both the first application and "My Site", and a full crawl of the first application was not practical because of its huge content.

Conclusion:

We had to use this mixed approach to fulfil the requirement (especially for showing the comments count) on both applications.

For the first application, we used the REST call approach to fetch the comments count from the Comments list (by adding the above Access-Control-Allow-Origin entries to the web.config of "My Site", which works well for a single domain).

For the second application, we used the full crawl technique (with only "My Site" set for full crawl), where a 30-minute crawl schedule updates the NumComments managed property and the web part then shows the correct number.

I hope this helps someone struggling with a similar situation.


Happy Learning! :)

Display Default Value in People Picker Control in List New Form

In SharePoint 2010 on-premises, our customer came up with a small new requirement: show a default value in the People Picker control on a custom list's OOB new form. As it was a simple OOB new form for a custom list, we analyzed the options available in 2010 accordingly.

  • If you want to do it in an InfoPath form, there is a very nice article presented here by Laura Rogers.
  • If you want to do it in SharePoint 2013, please refer to this nice article, which describes how to do it with JSLink.


In our case it was SharePoint 2010 and a custom list OOB new form. The problem is that we cannot set a default value for the People Picker control at design time, so we basically had two options: custom jQuery injection, or a third-party SharePoint add-on.

We observed the following pros and cons for the SharePoint add-on:

Pros:
  • It can be very useful for setting default values at design time for lookup or People Picker fields.

Cons:
  • It is not free; the free trial period is only 30 days.
  • Setup installs a new WSP in the farm, which might cause maintenance or update problems in the future.


So we decided to go ahead with the first option instead, i.e. the custom jQuery injection approach, which was simpler and quicker to implement.

For this we edited the default new form for the custom list, then added a Content Editor web part to the form so we could attach a custom .js file to it (remember, there was no JSLink in SP 2010). The next step was to create the actual jQuery file that injects the default value into the People Picker control.

In our case the customer wanted to show a particular person's name in the People Picker as the default, not the logged-in user. If you need to show the logged-in user as the default, you may refer to this link, with a little variation and the use of the "jquery.SPServices" library.


We created the following jQuery injection script for our purpose. Here is the custom script, named "setdefaultvalue.js" (this assumes "jquery.min.js" is already available in the "_layouts/js" folder in the 14 hive; if not, you can download it from here and place it in that 14 hive path).

setdefaultvalue.js
<script src="/_layouts/js/jquery.min.js"></script>
<script type="text/javascript">
$(document).ready(function () {

    // The default user to inject, in "Domain\Username" form.
    var user = "Domain\\Username";

    // Set "user" as the default value in the required People Picker control.
    // "counter" selects which People Picker on the form gets the value
    // (here the second one, i.e. index 1).
    var counter = 0;
    $('div[title="People Picker"]').each(function () {
        if (counter == 1) {
            $(this).html(user);
        }
        counter++;
    });

    // Trigger the "Check Names" button explicitly so the value gets resolved.
    counter = 0;
    $('a[title="Check Names"]').each(function () {
        if (counter == 1) {
            $(this).click();
        }
        counter++;
    });
});
</script>

Note: In this script we used the "counter" variable to target the position at which our People Picker control appears on the form (needed only when there are multiple People Picker controls on the form).

We then uploaded this file to that site collection's Style Library.




And then referenced it in the Content Editor web part's "Content Link" property, as below.




Just make sure of one thing: keep the zone index of the Content Editor web part later than the main form web part. For example, if the main form web part's zone index is 1, keep the Content Editor web part's zone index at 2. This ensures that all the controls on the form are loaded first and only then does the jQuery script fire.


After all these steps, the default value started showing up correctly in the desired People Picker control on the list's new form.



Happy Learning! :)

NoteField values not getting crawled in SharePoint 2013

Yesterday, we were working on a new feature in our project that involved new content types and site columns. Among those site columns, one field was of type "Note" (Type="Note").

The customer requirement was to show this "Note" field in the second row for each item in the Search Results web part. So we followed the regular procedure for exposing a field to the Search Results web part: we created a few pages with data filled in for all fields (including this Note field) and then ran a full crawl. After some time, the managed properties were created under the Search Schema, but they were not set as "Queryable", "Searchable", "Retrievable", "Refinable". So we applied those settings and ran the full crawl again.

When the crawl was done, we checked the values with the SharePoint Search Query Tool on the server. It showed values for all fields except this "Note" field. We checked all our configurations again, but they were correct. We googled this issue a lot but couldn't find a satisfactory answer.

Then, while researching further, we added a few other "Note" columns with different settings for testing. After crawling, we observed in Elements.xml that the setting "UnlimitedLengthInDocumentLibrary" was "False" for the Note fields whose values were showing in the search tool and "True" for the Note fields whose values were missing. This setting, when "True", allows the text length to grow to an unlimited number of characters, and this seems to be why the search crawler skips such fields, to avoid poor performance and space constraints. We informed our customer and changed the requirement to keep the "Note" field column at a 255-character length.

By default, a Note field allows only 255 characters, and if you really want to allow more text in a Note field, you need to enable this property:

UnlimitedLengthInDocumentLibrary="TRUE"


You can see this setting from Site Settings > Site Columns > Edit any Note field.
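The same setting can also be inspected, or switched back, from the server object model. Here is a minimal sketch (run on the SharePoint server; the site URL and column name are illustrative):

using System;
using Microsoft.SharePoint;

class NoteFieldCheck
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intra.abc.com"))        // example site URL
        {
            SPWeb web = site.RootWeb;                                    // disposed together with the SPSite

            SPFieldMultiLineText noteField =
                (SPFieldMultiLineText)web.Fields["My Note Column"];      // illustrative column name

            Console.WriteLine("Unlimited length: " + noteField.UnlimitedLengthInDocumentLibrary);

            // Switch back to the crawl-friendly 255-character behaviour and
            // push the change to lists that already use this site column.
            noteField.UnlimitedLengthInDocumentLibrary = false;
            noteField.Update(true);
        }
    }
}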


Usually, in web parts like Search Results, CBS, etc., we show columns like Title, Image, URL, Description, etc., and we shouldn't show big text columns in the results. Such columns should be shown only on individual pages (e.g. the content area, a Note field, etc.).

I hope this helps someone out there!

Happy Learning! :)

The SOAP message cannot be parsed.

In SharePoint 2010, I had a customer requirement to add new fields to the existing InfoPath form of a custom list holding a large amount of data. For this I started editing the InfoPath form with MS InfoPath Designer 2010 (which comes with MS Office).

To edit the form, you can click the "Customize Form" ribbon button on the "List" tab; it opens the form in the InfoPath Designer tool.


I added some new fields to the form and tried to publish it using either of the below marked actions.



But the design checker threw the exception "Control binding is not supported" for 3 different calculated value fields. I managed to resolve this issue with the help of this blog.

I clicked "Quick Publish" again to publish the form; this time the above 3 exceptions were gone, but publishing took a long time and then failed with the following error:

The publish operation could not be completed. It cannot be determined if the form template was successfully published. Try publishing the form template again, or change the list settings to use the default SharePoint form.
-          The SOAP message cannot be parsed.



The exception was not pointing me in the right direction at all. When I checked the ULS logs, there was no specific or helpful error. There was also no problem with permissions on the list; the user had Full Control.

After googling for some time, I found the following suggested solutions for this particular error:

  • Increase the executionTimeout value in web.config to 3600 seconds. In my case it was already set: <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />

  • Tried increasing the data connection timeout value (to 1 hour) for InfoPath Forms Services from Central Administration.



  • Checked whether the site collection feature "SharePoint Server Enterprise Site Collection Features" is activated. In my case, this feature was already activated.


  • Tried updating missing bindings for some drop-down controls.
  • Tried removing all calculated value fields from the InfoPath form.
  • Tried changing the name and ID of the template.xsn.
  • Traced the InfoPath form publishing in Fiddler and investigated the exception in the response, i.e. "Request timed out" with HTTP status 302. All answers led to the same resolutions already tried above.
  • Tried restoring the site collection again.
  • Tried with a new custom list; for it, InfoPath form publishing worked correctly.
  • Tried importing the changed/saved template with SharePoint Designer and then renaming it. But it then showed the following error when I clicked the "Add new item" link.

Unfortunately, none of the options above helped me.

After trying all these options, I gave more thought to the "Request timed out" exception from Fiddler. Even though I had already increased all the timeout values above, I kept googling the issue. After spending a lot of time on it, I found that a few blogs mentioned huge lists, i.e. lists with a large number of items, and that was true for this list as well: it had around 4500 items.

So I decided to remove items batch-wise (max 100 per batch) from the list and let them go to the recycle bin. I also kept checking form publishing during this process; it was still failing. But as the number of items in the list gradually decreased, it finally reached the level at which InfoPath form publishing was successful. :)

The magic number was around 1500 items. When I checked multiple times, I found that above this level the form could not be published. The reason may be that the form contained 25 calculated value fields, and hence publishing was failing for the large item count.

Now that the reason was known, publishing an updated form with new changes required either moving the items above 1500 to a similar duplicate list and then moving them back, OR deleting those items and then restoring them from the recycle bin. That's it!
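For reference, a minimal sketch of that batch recycle with the server object model could look like the one below (run on the SharePoint server; the site URL, list name, threshold and batch size are illustrative). The recycled items can be restored from the recycle bin once publishing succeeds.

using System;
using Microsoft.SharePoint;

class ReduceListForPublishing
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intra.abc.com"))     // example site URL
        using (SPWeb web = site.OpenWeb())
        {
            const int threshold = 1500;    // level below which publishing worked for us
            const uint batchSize = 100;    // max items recycled per batch

            while (true)
            {
                SPList list = web.Lists["MyCustomList"];              // illustrative list name
                if (list.ItemCount <= threshold)
                    break;

                SPQuery query = new SPQuery();
                query.RowLimit = batchSize;
                SPListItemCollection batch = list.GetItems(query);

                // Walk the batch backwards so recycling does not disturb the enumeration.
                for (int i = batch.Count - 1; i >= 0; i--)
                {
                    batch[i].Recycle();    // goes to the recycle bin, not a permanent delete
                }

                Console.WriteLine("Recycled a batch of {0} items.", batch.Count);
            }
        }
    }
}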



Note: FYI, during this process I also faced a different exception while publishing:
The SharePoint list form can't be customized with InfoPath because fields of an unsupported data type are marked as required, or because fields are corrupted. In SharePoint, try deleting the columns or editing the column properties to remove the required attribute.
-          ABC (Lookup)

The solution for this was to remove the columns from the SharePoint list and then refresh the fields in the InfoPath form. Then I deleted the drop-down list control from the form and added it again, bound to the lookup list column. Publishing was then successful.



Happy Learning! :)