
Search Engine not working properly in your App?

Hi,

I know this might sound stupid, and it really is, but I recently spent a few hours trying to figure out why the Search Engine wasn't working properly when queried from my App.
It turned out that I had forgotten to request the permission to use it. So, I was missing this line in my AppManifest.xml:

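It should look along these lines (the Search scope with the QueryAsUserIgnoreAppPrincipal right, double-check it against your own manifest):

<AppPermissionRequests>
  <AppPermissionRequest Scope="http://sharepoint/search" Right="QueryAsUserIgnoreAppPrincipal" />
</AppPermissionRequests>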
While this is very stupid, this little distraction made me realize that the system isn't behaving as expected. Indeed, one could expect the App Model to throw an exception such as "Attempt to perform an unauthorized operation", but it doesn't. It actually queries the Search Engine successfully; simply, no results are returned.
It behaves as if some kind of security trimming were applied, but that's not the case, since the App had full control over the site collection... Moral of the story: just don't forget this line and you should avoid some headaches :).

Happy Coding!



Demo SharePoint-Hosted App showing how to use the REST API (CRUD, micro-blogging, following content, people, search)

Hi,

Update: I've published this App to the online SharePoint Store, which makes it easier to install. The App is available here: http://office.microsoft.com/en-us/store/rest-api-demo-WA104068939.aspx?q.... However, since the SharePoint Store doesn't permit Tenant Full Control permissions, the App's capabilities are slightly reduced compared to the one published on CodePlex.

Creating a micro blog with new hashtags with the REST API

Hi,

Update 04/2013: download my demo App at http://sptoolbasket2013.codeplex.com/

Similarly to my previous post, here is how to create a new hashtag inside a microblog post using the REST API from a SharePoint-Hosted App. Basically, you also have to go through the ContentItems collection, as shown in the sketch below.
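This is a sketch of the JSON to send via a POST request to /_api/social.feed/my/Feed/Post; it follows the same pattern as the mentions example in the next post, with the hashtag carried by the Text property of a SocialDataItem and referenced by its index in ContentText (the ItemType value 3 is assumed here to correspond to SP.Social.SocialDataItemType.Tag, so double-check it against the enumeration):

{
    'restCreationData': {
        '__metadata': {
            'type': 'SP.Social.SocialRestPostCreationData'
        },
        'ID': null,
        'creationData': {
            '__metadata': {
                'type': 'SP.Social.SocialPostCreationData'
            },
            'Attachment': null,
            'ContentItems': {
                'results': [
                    {
                        '__metadata': {
                            'type': 'SP.Social.SocialDataItem'
                        },
                        'ItemType': 3,
                        'Text': '#MyNewHashTag',
                        'Uri': null
                    }
                ]
            },
            'ContentText': 'Talking about {0}',
            'UpdateStatusText': false
        }
    }
}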

Creating a micro blog with mentions with the REST API

Hi,

Update 04/2013: download my demo App at http://sptoolbasket2013.codeplex.com/

As I've struggled a lot to get this working, and since I couldn't find the information anywhere, I've decided to write this blog post showing how to create a microblog entry in SharePoint 2013 for the connected user (from a SharePoint-Hosted App) using mentions.

On MSDN, you can find a sample showing how to create a microblog entry here: http://msdn.microsoft.com/en-us/library/jj822974.aspx. While this is already a good start, there is no mention of posting more complex microblog entries, for instance embedding mentions of other people. If you just add @user in the text body, you won't end up with a valid solution :).

So, after a bit of digging, I eventually found out how this works. In the example below, I'm targeting the users demo and eni by saying hello to both of them. This is the result when posted:

and here is the JSON data you have to send via a POST request to /_api/social.feed/my/Feed/Post to do that:

{
    'restCreationData': {
        '__metadata': {
            'type': 'SP.Social.SocialRestPostCreationData'
        },
        'ID': null,
        'creationData': {

            '__metadata': {
                'type': 'SP.Social.SocialPostCreationData'
            },
            'Attachment': null,
            'ContentItems': {
                'results': [
                    {
                        '__metadata': {
                            'type': 'SP.Social.SocialDataItem'
                        },
                        'AccountName': 'dc07\\demo',
                        'ItemType': 0,                                        
                        'Uri': null
                    },
                    {
                        '__metadata': {
                            'type': 'SP.Social.SocialDataItem'
                        },
                        'AccountName': 'dc07\\eni',
                        'ItemType': 0,
                        'Uri': null
                    }
                ]
            },
            'ContentText': 'Hello @{0} and hello @{1}',
            'UpdateStatusText': false
        }
    }
}

The magic happens in the ContentItems collection, where you provide the information about each targeted account; you then reference each account in the ContentText property by its respective index.
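For completeness, here is a minimal sketch of how that payload can be posted from a SharePoint-Hosted App page with jQuery; it assumes the page exposes _spPageContextInfo and the usual __REQUESTDIGEST control, and that postData holds the JSON object shown above:

var postData = { /* the restCreationData object shown above */ };

$.ajax({
    url: _spPageContextInfo.webAbsoluteUrl + "/_api/social.feed/my/Feed/Post",
    type: "POST",
    data: JSON.stringify(postData),
    headers: {
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "X-RequestDigest": $("#__REQUESTDIGEST").val()
    },
    success: function (data) {
        // the service returns the created thread in data.d
        console.log("Microblog entry created");
    },
    error: function (xhr) {
        console.log("Error: " + xhr.status);
    }
});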

Happy Coding!

Double quotes & single quotes when using the Search REST API of SharePoint 2013 combined with KQL from JavaScript

Hi,

Update 04/2013: download my demo App at http://sptoolbasket2013.codeplex.com/
I've been facing a rather strange issue when using the new Search REST API of SharePoint 2013, and you might run into the same kind of issue. As it took me a while to figure out the problem, I thought it was a good idea to share both the problem and the fix.
Double quotes
First, the problem! It occurs when trying to pass double quotes into the querytext parameter. If you do not use the KQL syntax, you don't have any problem. So, if I transmit this:

/_api/search/query?querytext='test%22'


it's working fine.
If you're using KQL to say, for instance, that you want to retrieve only items whose title equals "test", you can do it this way:

/_api/search/query?querytext='Title="test"'


Usually, you'll make sure to encode the full querytext parameter. However, with the KQL syntax, you can also use parentheses, and if you repeat the same query as before with parentheses:


/_api/search/query?querytext='(Title="test")' 
=> encoded value is: /_api/search/query?querytext='%28Title%3D%22test%22%29'


it will also work fine. Usually, you'll use parentheses if you want to set priorities among multiple search criteria; this is not a must, since the above query also works fine. But... strangely enough, if you add a double quote in the value, despite the fact that it is encoded:


/_api/search/query?querytext='(Title="test " double quote")' 
=> encoded value is: /_api/search/query?querytext='%28Title%3D%22test%20%22%20double%20quote%22%29'


You can see that the value test " was encoded to test%20%22, so the double quote is escaped, at least from a web point of view, yet you will still receive the following error from the service:


Status:500Error:{"error":{"code":"-1, Microsoft.Office.Server.Search.REST.SearchServiceException","message":{"lang":"en-US","value":"We didn't understand your search terms. Make sure they're using proper syntax."}}}


and the reason for that is the parentheses... If you get rid of them, you won't get that message anymore. So, make sure to use parentheses only if you really need them, even with multiple search criteria. I don't know exactly why the parentheses have this effect, but they definitely seem to have a weird impact.
Actually, the double quote character is meaningful for the KQL engine, so I think the encoding doesn't change anything and the character is still interpreted somehow by the KQL engine. I tried many different ways of escaping it, with %22, \", \x22 in JavaScript... and none of them works.
If I have an item whose title contains " quotes and I try to search using the encoded form, I get no error without parentheses but also no results, no matter how I encode the "; with parentheses I get the error described earlier; and it just works fine, with the right results, if I replace " with nothing.
Single Quote
Here it's even worse: whatever you try to do, the server will return an error, whether you're working with KQL or even with a basic search term... and the reason is that the ' character encloses the value of the querytext parameter, which is most probably why it goes mad. So, again, if I just remove the ' character from the search value, it finds results correctly, even with an exact match. So if you search for Title="test single quote", items whose title is test ' single quote are returned...
So, I'm not sure that removing both ' and " from the search terms is the right workaround, but it seems to work and you'll probably avoid some headaches.
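A minimal sketch of that workaround (the helper name is just an illustration):

// Strip single and double quotes from the raw term before building the KQL query
function buildTitleQuery(rawTerm) {
    var sanitized = rawTerm.replace(/['"]/g, "");
    return "Title=\"" + sanitized + "\"";
}

var url = _spPageContextInfo.webAbsoluteUrl + "/_api/search/query?querytext='" +
    encodeURIComponent(buildTitleQuery("test \" single ' quote")) + "'";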
Note that these problems only occur when working with GET and _api/search/query; you won't run into them when using _api/search/postquery because there, you can just JSON.stringify the entire request and pass it as the data parameter, and that works like a charm.
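Here is a sketch of that postquery alternative; to the best of my knowledge, the request payload is typed as Microsoft.Office.Server.Search.REST.SearchRequest when using the verbose OData format:

$.ajax({
    url: _spPageContextInfo.webAbsoluteUrl + "/_api/search/postquery",
    type: "POST",
    data: JSON.stringify({
        'request': {
            '__metadata': { 'type': 'Microsoft.Office.Server.Search.REST.SearchRequest' },
            // the quotes travel in the POST body, not in the URL
            'Querytext': 'Title="test"',
            'RowLimit': 10
        }
    }),
    headers: {
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "X-RequestDigest": $("#__REQUESTDIGEST").val()
    },
    success: function (data) {
        console.log(data.d);
    }
});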

Happy Coding!

Using CSOM from an App Part

Hi,

Since I couldn't find any example on the web showing how to use CSOM from an App Part, and since it's actually a bit trickier than using CSOM from the App itself, I thought it would be a good idea to blog about it!
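The usual gotcha is that an App Part runs in its own IFrame, so the hosted page can't rely on the parent page's context: it has to read the SPAppWebUrl/SPHostUrl tokens from its own query string (provided the client web part's Src URL includes {StandardTokens}) and build the ClientContext from them. A minimal sketch of that approach, assuming the SharePoint JS libraries are loaded by the App Part page:

// Read a parameter from the App Part page's own query string
function getQueryStringParameter(name) {
    if (document.URL.indexOf("?") === -1) return null;
    var params = document.URL.split("?")[1].split("&");
    for (var i = 0; i < params.length; i++) {
        var pair = params[i].split("=");
        if (pair[0] === name) {
            return decodeURIComponent(pair[1]);
        }
    }
    return null;
}

var appWebUrl = getQueryStringParameter("SPAppWebUrl");
var hostWebUrl = getQueryStringParameter("SPHostUrl");

// Build the client context from the app web URL passed to the App Part,
// instead of relying on the parent page's context
var context = new SP.ClientContext(appWebUrl);
var factory = new SP.ProxyWebRequestExecutorFactory(appWebUrl);
context.set_webRequestExecutorFactory(factory);

// Example: read the title of the host web through SP.AppContextSite
var hostContext = new SP.AppContextSite(context, hostWebUrl);
var hostWeb = hostContext.get_web();
context.load(hostWeb);
context.executeQueryAsync(
    function () { console.log(hostWeb.get_title()); },
    function (sender, args) { console.log(args.get_message()); }
);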

OData CRUD in SharePoint 2013 + Authentication scheme explanation

Hi,

A while ago, I wrote a blog post about OData and recorded a video that shed some light on how to use one of the most underused built-in factories of SharePoint 2010.
That example explained how to leverage OData to query data stored in memory, but it only demonstrated the "R" of CRUD. Before reading further, if you're not familiar with hosting WCF Data Services in SharePoint, I'd really recommend reading my previous post, since I won't go into those details again here.
In SharePoint 2010, this was quite difficult to implement when it came to data updates, because in many scenarios the authentication providers associated with the web applications are still based on Windows integrated security. As described on MSDN, the article on WCF Services in SharePoint Foundation explains that the authentication scheme (anonymous/NTLM/Kerberos) is added by SharePoint Foundation 2010 to the URL (and to the ID of the returned entities) to uniquely identify resources when multiple authentication protocols are enabled for a single web application (for example anonymous/Kerberos).
So far so good, but what MSDN doesn't explain is that, because of this, only the R of CRUD works... To illustrate that, I've written a very basic service that is exactly the same on 2010 and 2013, but you'll see that the behavior is slightly different. This demo service handles a list of courses and supports CRUD operations. I'll share the code a bit later, but first let's see how this service reacts on SharePoint 2010.
When calling it from the browser, you get this:

As you can see, the authentication scheme is indeed part of the URL... Therefore, if you try to perform an update from a server-side consumer component, say a console app for which you've added a service reference to /_vti_bin/ODataCRUD.svc, you get the following:

As you can see, the URL used to target the object we want to update contains NEGOTIATE... and this results in the highlighted error.
Despite the fact that the very promising class attribute [ServiceFactoryUsingAuthSchemeInEndpointAddress(UsingAuthSchemeInEndpointAddress = false)] exists, it has, at the time of writing, absolutely no effect on the WCF service behavior... meaning that the above error persists.
So, except for consuming a custom OData service for updates/deletes via AJAX (with AJAX, you generate the URLs manually), I never managed to get it working from a server-side component consuming the WCF service via a proxy. Of course, working with HttpWebRequest works:

// Update a course entity directly with an HTTP MERGE request
// (requires using System.IO, System.Net and System.Text)
HttpWebRequest req = WebRequest.Create("http://server/_vti_bin/ODataCRUD.svc/Courses(ID)") as HttpWebRequest;
req.Method = "MERGE";
req.Credentials = System.Net.CredentialCache.DefaultCredentials;
req.ContentType = "application/json";
byte[] data = Encoding.UTF8.GetBytes("{ \"CourseTitle\" : \"Updated Course\" }");
req.ContentLength = data.Length;
using (Stream dataStream = req.GetRequestStream())
{
    dataStream.Write(data, 0, data.Length);
}
req.GetResponse();

Leveraging .NET 4 in SharePoint 2013 Series (7) - WCF REST Caching

Hi,
I'll be writing a series of blog posts about leveraging .NET 4 within SharePoint 2013. As you might already know, SharePoint 2013 is now based on .NET runtime version v4.0.30319, which allows developers to benefit from .NET 4 features. You couldn't do that with SharePoint 2010, which was still based on .NET runtime v2.0.50727.

AspNetCacheProfile

WCF 4 brings a new caching mechanism that is built on top of ASP.NET caching, and one can easily use it in the context of SharePoint 2013. Say you want to build a service that returns the latest news of the company in the form of either ATOM XML or RSS.

Since these news items are the result of an aggregation that takes time to complete, you'd likely want to cache them in order to avoid performing costly operations whenever someone sends a request to view them.

With WCF 4, it's a piece of cake, and the good news is that it's not much more complicated to integrate in SharePoint 2013. Indeed, two things must be done:

  • Define cache profiles in the web.config

    <caching>
      <outputCacheSettings>
        <outputCacheProfiles>
          <add name="CompanyNewsFeed"
            duration="3600"
            varyByHeader="Accept"
            varyByParam="" />
        </outputCacheProfiles>
      </outputCacheSettings>
    </caching>
    

    The drawback here is that you must define this in the web.config of the SharePoint web application you are targeting. You can't define the caching section in a custom web.config deployed along with your .svc file, because this section can only be defined in the root web.config. So you either check with your administrators, or write some code with SPWebConfigModification (automatic but sometimes dangerous), or document the change and let the administrators cope with it.
    The best option would of course have been to deploy a custom web.config file along with your service so that nothing needed to be changed, but I couldn't get that to work unless I converted the ISAPI (_vti_bin) virtual directory into an application, and that's not something you want to do in SharePoint, right?
    Anyway, above I'm just declaring a cache profile that retains data for 1 hour. So, any request after the first one will pull the data from the cache unless the Accept header changes. In the case of RSS/ATOM, this header changes according to the format you choose.

  • Decorate your service methods with the AspNetCacheProfile attribute and specify the cache profile you want to use

    For the sake of simplicity, I'm not going to follow the WCF best practices (Contract/Service/DataContract) and I'm not doing any error handling, in order to keep it short and focus on the caching story. I'm sure you'll forgive me :). So, I've assembled everything in the two following classes:

    //Data object
    public class News
    {
        public string Title;
        public string Body;
        public string Author;
        public DateTime PublishedDate;
        public News() { }
    }
    
    //Service Attributes
    [ServiceContract]
    [ServiceKnownType(typeof(Atom10FeedFormatter))]
    [ServiceKnownType(typeof(Rss20FeedFormatter))]
    [AspNetCompatibilityRequirements(
      RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]    
    [System.ServiceModel.ServiceBehavior(
      IncludeExceptionDetailInFaults = true)]
    public class NewsService 
    {
      [OperationContract]
      //This is where you make the link to the cache profile you defined earlier in the web.config
      [AspNetCacheProfile("CompanyNewsFeed")]        
      [WebGet(UriTemplate = "/news/feed/{format}")]
      public SyndicationFeedFormatter GetCompanyNews(string format)
      {
        //Here I'm just building a dummy list of news but
        //in the real world, this could be a heavy operation
        List<News> DummyNews = new List<News>();
        for (int i = 0; i < 4; i++)
        {
          DummyNews.Add(new News
          {
             Title = string.Concat("News ",i),
             Body = string.Concat("This is the fake news number ",i),                    
             PublishedDate = DateTime.Now
          });
        }
        //Building the returned feed    
        SyndicationFeed ReturnedNews = new SyndicationFeed            
        {
          Title = new TextSyndicationContent("Company News"),
          Description = new TextSyndicationContent("Company Feed"),
          Items = from NewsItem in DummyNews
                  select new SyndicationItem
                  {
                    Title=new TextSyndicationContent(NewsItem.Title),
                    Content=new TextSyndicationContent(NewsItem.Body),                                                      
                    PublishDate=NewsItem.PublishedDate
                  }
        };
    
        if(format == "atom")
          return ReturnedNews.GetAtom10Formatter();
        else
          return ReturnedNews.GetRss20Formatter();
        }
    }
    
    
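With the UriTemplate above, the two formats can be requested along these lines (the .svc name is just an example, it depends on how you deploy the service under _vti_bin):

http://server/_vti_bin/CompanyNews.svc/news/feed/atom
http://server/_vti_bin/CompanyNews.svc/news/feed/rss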

So, the result looks like this:

and if you look at the date & time, it won't change for 1 hour unless you decide to change the format (atom or rss). One last thing I should add, even if it's obvious: since this WCF caching capability relies on ASP.NET caching, it's not distributed. This means that in the context of SharePoint, where you usually have several WFEs, each WFE has its own cache state. This could lead to trouble for more sensitive data: if the load balancer randomly redirects your requests to any WFE, you could get different states unless you're using sticky sessions.

Happy Coding!

Tip: Using Device Channels of SharePoint 2013 with CloudShare

Hi,

I've been trying to use device channels in a CloudShare environment and I ran into a silly problem that made me lose a bit of time. You could probably encounter the same issue, so that's why I'm writing this note.

If you're using the preconfigured SharePoint 2013 Server image, you will likely enable web access so that you can test the device channels with actual mobile devices. If you end up, like me, with no error but SharePoint simply ignoring the configuration and serving the default channel whatever device you use for browsing, then you're in the same situation as the one I was experiencing :).

Although it might not sound obvious, this is actually due to an AAM issue. In the CloudShare FAQ, they explain that you're supposed to add this AAM if you have trouble connecting to a SharePoint 2010 environment.

In my case, with their default 2013 image, I didn't face any problem other than the fact that SharePoint was not serving the right device channel.

Anyway, I decided to add my vanity URL as an extra AAM public URL and the magic happened! This drove me nuts for a while, so that's why I decided to share this tip :)

Happy Coding!

Leveraging AppFabric for custom caching in SharePoint 2013

Hi,

[Update] The scenario depicted in this blog post is not supported. I wrote it in 2012 with the public beta of SharePoint and I had doubts about supportability. Microsoft confirmed one year later that using SharePoint's AppFabric instance for custom caching is indeed not supported. You should create another cluster instance...

As you might have noticed, AppFabric is now one of the prerequisites when installing SharePoint 2013. SharePoint makes use of it for workflows and activity feeds.

When I noticed that AppFabric was part of the prerequisites, I immediately tried to figure out whether SharePoint would give developers new APIs to leverage AppFabric's distributed cache capabilities. As SharePoint developers, on the 2007/2010 versions, when it came to caching, we always had to either use the ASP.NET cache (mostly in-proc) or rely on third parties such as NCache or AppFabric, but we had to install and configure those ourselves, since they were not integrated into SharePoint.

Unfortunately, at the time of writing, Microsoft has not released any public API enabling us to benefit from a farm-wide caching system. Indeed, the classes leveraging AppFabric are all marked internal, so there is no public API. However, the good news is that we know AppFabric will be deployed on every SharePoint setup; therefore, we can start using it in the context of SharePoint through the AppFabric client DLLs.

I built a wrapper that you can download here: http://sptoolbasket2013.codeplex.com/. Its purpose is to ease the use of the AppFabric layer in the context of SharePoint 2013. I should add that this is currently at an experimental stage, since I've never used it for a real project and I don't even know whether this wrapper would be supported by Microsoft :).

I'll describe hereafter all the tests I made with this wrapper & AppFabric in SharePoint 2013.