
Category: Azure

Mobile Services vs. A la Carte Services

The other day I was chatting with my colleague Nick Pinheiro about possible architectures for mobile solutions with Azure.  As you may or may not know, Windows Azure is the Microsoft cloud.  It provides a bevy of services, from website hosting to API creation, data storage, machine learning and much more.  Nick and I are in the process of building separate mobile applications, and we often spit-ball ideas about the architecture of our applications and how we handle certain scenarios as they come up.  To set the basis of what we have been discussing: Nick’s application started off as a web application (actually, it’s a Facebook app, for which Nick was honored as having one of the top Facebook apps on the planet!).  He is now in the process of making his app a native mobile application.  My application comes from the perspective of building a brand-new Azure Mobile Services application.  Both approaches have their own unique pros and cons.  So this led us to wonder what the pros and cons might be of using Windows Azure Mobile Services vs. building a mobile application using the individual (a la carte) services offered by Windows Azure.  That is what we will be discussing in this post.


Submitting a Search by Pressing Enter with #AngularJS

Well, I am well into the next chapter of AngularJS for my stock application (Mogul Match Stocks), which follows the tenets of Rule #1 by Phil Town, which I mentioned in my earlier post. BTW, just to share a quick win with you: Rule #1 told me to buy Twitter before the huge run-up last Friday, and I was up 20% as of this writing!  Anyway, back to the point of this piece.  In building the application I needed a search function which would allow my users to search for a stock by typing in the first few letters of the company name or the first few letters of the ticker symbol.  In the design of the site, as you can see in the screenshot below, I needed a search box in a few places.  AngularJS handles this scenario really well with directives!  After building out my site with the search boxes as I wanted, I began to test.  During my testing of the solution I found myself hitting the Enter key when I was done entering the letters for my query.  Of course, I had not told the form to submit for the search box, but only to submit when the little search icon was clicked.  This became annoying, and I figured I would implement it since my users might want to do the same.
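As a rough sketch of the directive approach (the directive name `ngEnter` and the `search()` expression are my placeholders, not necessarily what the site uses), the Enter-key logic can live in a link function that evaluates an attribute expression when key 13 is pressed:

```javascript
// Hypothetical 'ngEnter' directive: evaluates the expression in the
// ng-enter attribute when the user presses Enter in the input.
// The link function is plain JavaScript, factored out for clarity.
function ngEnterLink(scope, element, attrs) {
    element.bind('keydown keypress', function (event) {
        if (event.which === 13) {               // 13 = the Enter key
            scope.$apply(function () {
                scope.$eval(attrs.ngEnter);     // run e.g. search()
            });
            event.preventDefault();             // suppress default form submit
        }
    });
}

// Registration against your module (assumed to be named 'app'):
// app.directive('ngEnter', function () { return ngEnterLink; });
```

In markup this would look like `<input type="text" ng-model="query" ng-enter="search()" />`, with the same `search()` wired to the search icon’s `ng-click`; Angular normalizes the attribute `ng-enter` to the directive name `ngEnter`.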



Creating an Instagram Subscription API with Azure Mobile Services

Windows Azure Mobile Services has been an exciting tool I’ve been using to develop my mobile applications.  One of the key things when building many kinds of mobile applications is integration with the big social networks.  With Instagram becoming the way to share photos, it was one of those integration points I needed in my application.  Instagram provides a “real-time” API service which uses the PubSubHubbub model; details about the real-time API can be found on Instagram’s developer site.

My application required images from Instagram to show up in a user’s feed as they became available.  At first I thought about just polling the Instagram feeds using a scheduled job (which is part of the mobile service); however, this solution did not scale well due to the limited number of SQL Azure connections in the service.  This led me to the real-time API, so Instagram could tell me when to update my application.  As part of this I also wanted users to have the ability to specify the tag they would like to follow, as well as the username (in many cases their own) they would like to follow.  Given these requirements, I needed a generic enough API which would allow my application to subscribe to these objects in Instagram and then have them populate in my application by storing them in a table.

So enough background let’s get to the code.


Sending a File Attachment using SendGrid

I was looking to automate the process of downloading an Excel spreadsheet from a web site and then sending that spreadsheet to one of my many inboxes. This process needed to run hourly. I have also been learning all I can about Windows Azure Mobile Services (WAMS), which comes “free”! (I like free, especially when I am learning and dabbling, but especially when I have some real code to get out.) Anyway, given my newfound knowledge of WAMS and the SendGrid service, which is an add-on to Azure, I thought maybe I could automate this process quickly and freely. So begins my journey. WAMS includes a scheduler service; given that I need this to run hourly, it is the natural choice.

In developing the script, given my newbie status with node.js, I was at a loss for how to get the file. In my case the file was hosted on a third-party web site with a VERY accessible URL. I just wanted to download the file and attach it to my email. Enter SendGrid: SendGrid is a powerful email service which provides a RESTful email API, allowing me to send emails from the cloud without having to worry about having an SMTP server at the ready! Awesome, since I don’t have an SMTP server hanging around! So begins the adventure.

I looked up the sample from the Windows Azure documentation, and for a basic send it works great. However, in looking through the SendGrid API it turns out there is an “Email” object which provides more control over what is sent, including the ability to add attachments. One of its methods allows me to set the URL where the attachment resides, and SendGrid takes care of all the legwork of downloading the file and attaching it to the email! (AWESOME!!)

Below is what I ended up with after fighting through some syntax errors! All in all this now works to a T and I get my hourly emails for processing like clockwork!

I even used (yet another “free” service) Send to DropBox to send the file to my dropbox account for easy filing!


var sg = require('sendgrid');

function myjob_hourly() {
    // Placeholders: the spreadsheet URL, the destination address, and the subject.
    var theUrl = "";
    var theemail = "";
    var currentDate = new Date();
    var subject = "";

    var sendgrid = new sg.SendGrid('username', 'password');
    var email = new sg.Email({
        to: theemail,
        from: theemail,
        subject: subject,
        text: ''
    });

    // SendGrid downloads the file at the URL and attaches it to the message.
    email.addFile({ filename: "myfile.xlsx", url: theUrl });

    sendgrid.send(email, function(success, message) {
        if (!success) {
            console.error(message);
        }
    });
}

Migrating from SharePoint 2010 to O365 in the Real World

I’m working with a customer who has 7,000+ sites hosted in an on-premises SharePoint 2010 farm.  The farm has about 78 custom WSPs deployed throughout.  We had very little source code for the WSPs, as this customer was a perfect example of how not to do SharePoint application development.  We had solutions ranging from the deployment of custom CSS to web services deployed to the ISAPI folder on the WFEs.  Needless to say, many recommended practices were not followed.

The following steps were used to evaluate the farm.  These are necessary in the Planning stages to determine what will/can be moved.

  1. Capture the Inventory of your sites and the Information Architecture. The inventory should capture the following:
    1. List of Web applications
    2. List of Site Collections
    3. List of Sites
    4. List of Features and their Associated Site Collections/Sites (Important for STEP 2)
    5. Size of each site
    6. Last Updated Date for the site items. (Will help identify dead sites)
  2. Capture the inventory of your Solution Packages (WSPs) and download.
  3. Use a tool like SPCAF http://www.spcaf.com/ to evaluate.  SPCAF will provide a report for migration to the cloud, hence the recommendation to use the tool.  (This tool was invaluable in getting the evaluation of the WSPs done in a timely manner.)  We classified each WSP into one of the following buckets; the timelines are based on our team’s capacity to convert the items.
    1. Low (2-4 weeks) – Converting simple SharePoint artifacts like web parts, Content Types/Branding to the new model.
    2. Medium (4-6 Weeks) – Solutions with complex deployments/configurations.  This would include things like rewriting a Visual Studio Workflow and or converting Event receivers.  This could also include those solutions with numerous Web parts and other artifacts which need to be converted.
    3. Medium-High (6-10 Weeks) – Solutions which will need some re-architecting and/or alternate configurations due to artifacts and/or services which are deployed directly to the SharePoint server.  An example here would be a web service deployed to the ISAPI folder; since this is not allowed in O365, we will need an alternate deployment location (preferably Azure) and a way for the sites leveraging these solutions to reach it.
    4. High (10+ Weeks) – These should be full-on rewrites which will require requirements gathering and building of a project backlog.  They will require architecture design of the solution/app and would typically be a large solution/project with multiple components.
  4. Once the WSPs were classified, we evaluated the sites from Step 1 and matched them up with the sites using the custom features in the WSPs from Step 3.  This gives us a report on which sites contain customizations and which do not.  We now have an idea of which sites will be trouble!
  5. We now needed to determine which sites get migrated first.  We created a set of buckets for Site Classification:
    1. Low – No Customizations Required.  These sites were migrated first.
    2. Medium – Small customizations which could be completed quickly such as the Low & Medium solutions from Step 3 above.
    3. High – Any customizations which will take some time to migrate.
  6. We then formulated the schedule for migration based on the above information.  We are using a migration tool to push the data to O365 as part of our Testing and eventually our Execute stage.


SharePoint Artifact Migration Recommendations

Once we had our inventory, we classified each of the custom artifacts with a plan of attack for migrating these items to the cloud.  Because Solution Packages (WSPs) are not allowed in O365, these packages and their associated SharePoint Features will need to be broken down and converted to the SharePoint App Model where applicable.  Many of the guidelines for converting these solutions reside as part of the O365 Patterns and Practices recommendations.  Below are the guidelines we used for the various artifacts:

  1. Provisioning/Web Templates/Site Templates – Any web templates and site templates as well as the process for provisioning new sites should leverage the guidelines from the O365 Patterns and Practices team.
  2. Branding & Layout – Contains the deployment of CSS, JavaScript, master pages, page layouts and the associated content types.  The migration of these items should follow the guidance from the O365 Patterns and Practices team.  (http://www.microsoft.com/en-us/download/details.aspx?id=42030)
  3. Workflows – Custom workflows created in Visual Studio which use code-behind to complete tasks.  These will need to be converted to the SharePoint 2013 declarative workflow model, which means rebuilding them from scratch.
  4. Content Types – These were custom content types which were deployed as a part of a WSP Solution. Content Types should be evaluated to determine if they can reside in a Content Type Hub.  This will provide the content types to all Site Collections subscribing to that hub and provide a safe location for management of the enterprise content types which were previously deployed as WSP solutions. Secondly, Content Types can be deployed using CSOM and the App Model APIs available for use in the O365.
  5. List Templates – custom list templates and in some cases their associated implementations which are deployed as part of a WSP Solution.  List Templates can be deployed via the CSOM APIs and by leveraging the Application model APIs.
  6. Web parts –  Custom web parts which are deployed as WSPs. All Web parts will need to be recreated as SharePoint 2013 App Parts.  This will bring these solutions into the App Model configuration and allow for the deployment of these Web Parts into the Cloud.
  7. InfoPath Forms – InfoPath forms which contained code-behind and/or required full trust in order to work.  Administrator/full-trust InfoPath forms should be rewritten as SharePoint-hosted apps.  This allows the required flexibility for the forms while making each form usable on multiple sites as needed through the SharePoint 2013 App Model.
  8. WCF/Web Services – WCF Services which were deployed to the Web Front End servers which contained calls to the SharePoint APIs.  These services need to be redesigned/built as SharePoint Provider Hosted Apps.
  9. Web.Config Modifications – Changes to the Web.config on the Web Front Ends which were needed to enable functionality by custom code. These modifications will need to be identified and the associated solutions which required them will need to be re-factored into a supported model.  Most likely these solutions will become Provider Hosted Applications for which the required customizations can be made without deploying anything to the SharePoint servers.