A wrap-up of my April travel

Wow… what a month of April.  If you have been following me on Twitter or Foursquare, you know that I’ve done a bit of travelling and a bit of speaking this past month.  This is the wrap-up of those events, and a look ahead at my future schedule.

First Stop… Boston!

For the first week of April, I spent three days in Boston at the Telerik home office and at a private company event.  I got to present to some guests at this event and spend some time with most of the Telerik DevTools Developer Evangelists.  It was a good couple of days to recharge my batteries on Telerik products and work with my team.  I met a number of great people in our technical community whom I need to keep in touch with.  I can’t say too much more about this event, but you will be seeing the fruits of it in the weeks and months ahead.

April 9 – Pittsburgh.Net

On the 9th, I drove across the commonwealth of Pennsylvania to Pittsburgh to speak to the Pittsburgh.Net user group.  This group meets in the Microsoft office that sits in the shadows of Heinz Field (home of the Steelers) and PNC Park (home of the Pirates).  The 60 developers in attendance gave me their ears for 90 minutes while I gave the first run of my One ASP.Net presentation.  The presentation ran a bit longer than I expected, and I ended up cutting some of my demos in favor of pre-written code.

The group enjoyed the presentation and said as much on their Meetup page.  Thank you, Pittsburgh.Net, for your hospitality!

April 17 – Philly.Net

It was time for some home cooking: a session at my home user group, Philly.Net.  This group always turns out for good content, and this evening was no exception.  120 people registered, and it was standing room only.  My buddy John Petersen spoke for the first half of the evening on the cool new features in ASP.Net 2012.2, and then I got to give my One ASP.Net talk in the remaining 45 minutes.  I cut my talk down even further from the Pittsburgh session and managed to squeeze in all of the topics I wanted to cover.

It is always great to speak to this group, not just because I know so many of them, but because they REALLY know their stuff.  The questions from the Philadelphia crew are always very deep, and they’re not afraid to tell you their opinion.  During my session, someone suggested that I assemble my content, along with sample code, scripts, and how-tos, into a unified resource that developers and project managers could reference when deciding whether to employ hybrid ASP.Net techniques.  This is a brilliant idea, and you will see the results of that effort in the next week or two.

April 23 – CapArea.Net

My final visit of the month was to the Washington DC suburbs for the CapArea.Net user group.  This was a very inquisitive group, just coming off their Code Camp weekend.  About 30 people showed up for my talk about One ASP.Net, and this time I had a full two hours to present the content.  I put all of my demos into this one and still came up short on time.  I guess it’s better to have more content and not need it than to run out of things to talk about…

This group was very interested in Telerik and our products.  I was afraid that I turned the end of the session into a bit of a sales pitch, but the group seemed very interested, and for that I thank them.

Home, for now…

After those four user groups, my family is ready for me to be home for a bit.  This won’t last too long, as I have visits on the books for Arizona, New Orleans, Northern New Jersey, and Tennessee in the months ahead.  I’m also looking at some other interesting groups in the Southwest and Midwest, planning a possible pair of road trips.

How to automatically schedule posts to Twitter, LinkedIn, and Facebook

Over the past few months, I’ve been publishing a LOT of content on this blog, my Telerik blog, and on Twitter.  To simplify publicizing these blog posts, I have configured a small handful of services to automatically detect new content and broadcast it appropriately.  What follows is a brief run-down of how I have configured this Rube-Goldbergian publicity machine.

It all starts with IFTTT

The main cog in this whole machine is a wonderful web application called IFTTT.  Don’t be intimidated by the acronym; it stands for If This Then That.  This is a web application that listens for events from services that you use and triggers actions on other services that you use.  The beauty of the whole thing is that the application is connected entirely with OAuth tokens.  My password never enters any of their screens.

IFTTT is built around a concept called recipes.  These are the mini-programs that you write to connect a trigger service to an action service.  The nearly 50 trigger services range from the date and Foursquare check-ins to ESPN sports score reports.  The 60 action services cover all kinds of notifications and publications, like Twitter, SMS, and even a voice phone call.

Among the collection of trigger services are RSS feeds.  This blog, along with my other blogs, all expose an RSS feed that IFTTT can consume.  From there, I trigger a free service called Buffer.

Scheduling Tweets with Buffer

Buffer is a free web service that will publish messages to Twitter, LinkedIn, Facebook, and App.Net.  With this service, you add links and messages to your Buffer message queue, and at scheduled times the application writes one message at a time to the target service.  Messages are published in the order they were added, but the Buffer dashboard gives you an easy drag-and-drop interface to change the order of publication.

Once again, all authentication is done with OAuth, so I am not handing my passwords to this service either.  I also have a plugin installed in Chrome (my current browser of choice) that enables me to add links and retweets to my buffer without navigating to Buffer.  Very handy.

Buffer has a free plan that will schedule posts to a single account on each of your connected services.  On the free plan, you can only configure one schedule for messages to be posted.  If you need more accounts or more schedule options, there is a paid subscription option that will manage more activity.

When Should I Post Those Tweets?

Finally, to ensure that I schedule these messages at the right times, I use another service called Tweriod.  Tweriod offers a free service that will analyze your tweets and interactions on Twitter and report a schedule of the best times to post.  This analysis can take some time and runs asynchronously; once you request a report, you should get a notification emailed to you within a few minutes with your results.

Tweriod can be connected directly to Buffer, so that the scheduled times in Buffer can be automatically configured.  Be sure to set your timezone properly in Tweriod so that you get a report with times that make sense to you.

Putting it all together

The connection of these items is a snap.  With the Tweriod schedule configured in Buffer, I simply created a series of recipes on IFTTT with a Feed trigger that sends a message to Buffer.

Give IFTTT, Buffer, and Tweriod a try and let me know in the comments below if you figure out a better way to manage your publications or other cool IFTTT recipes that you’ve put together.

Yes, you can use GitHub for Windows with BitBucket

I’m a big fan of both GitHub and BitBucket as online version control services.  I’ve been using git for source control since 2010, and these online services for just as long.  Full disclosure: I’ve been a paying customer of GitHub in the past, discontinuing when I completed the project that needed their full services.

I have always been comfortable with the command line, and using it to interact with git felt natural to me.  To assist those less comfortable with the command line, GitHub released a front-end for Windows-based developers called “GitHub for Windows”.  In this application, the entire interaction with the source code repository is streamlined and made simple for the user.  The user interface is designed to make storing your content on GitHub a “no-brainer”.  But it leads people to ask: “Can I use GitHub for Windows with BitBucket?”

Yes… yes you can.  In the rest of this article, I will show you how to configure this client app to interact with a BitBucket repository.

From the main screen of GitHub for Windows, click “+ add”.

In the resulting “new repository” window, enter whatever you’d like to name the repository and be sure NOT to mark the “Push to github” checkbox.  Click ‘Create’ at the bottom to create the local repository.

Next, in the repository list window, click the “right arrow” next to the name of the repository you just created to open it.

Click the “tools” gear at the top of the repository details screen and select the ‘settings…’ menu item.

In the ‘primary remote (origin)’ textbox, enter the SSH link for the repository on BitBucket.  You should find this on the right side of the screen on BitBucket.

You’ll need to drop to the command line to finish this.  From the repository details window, click “tools” and then “open a shell here”.  This should open either Git Bash or Windows PowerShell configured for git.  Once that is open, simply run:

git pull origin master

… and you’re all synced up.
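
If you’d rather skip the GUI entirely, the same wiring can be done from any git shell.  Here is a minimal sketch; the repository name and the BitBucket SSH URL are placeholders for your own:

git init MyRepo
cd MyRepo
git remote add origin git@bitbucket.org:yourname/myrepo.git
git pull origin master

After that, GitHub for Windows will happily commit and sync against the BitBucket remote just as it would against GitHub.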

ASP.Net WebForms can do FriendlyUrls too!

With the ASP.Net 4.5 release, we saw the addition of declarative model binding, value providers, unobtrusive validation, and model state.  While some of these features were carry-overs from the MVC framework, we were still missing that cool, SEO-friendly URL format.  With the release of ASP.Net 2012.2, Microsoft has delivered on that need.

The FriendlyURLs feature can be added to any ASP.Net WebForms 4 project by installing a NuGet package with the command:

Install-Package Microsoft.AspNet.FriendlyUrls -Pre

If you are not currently using routing in your webforms project, you will need to add a call to configure the routing.  If you are familiar with MVC, you should recognize this snippet in global.asax.cs:

RouteConfig.RegisterRoutes(RouteTable.Routes);

The content of your RouteConfig.cs file in the App_Start folder should look something like:

public static class RouteConfig
{
  public static void RegisterRoutes(RouteCollection routes)
  {
    // Turn on FriendlyUrls: requests for the .aspx URLs are
    // permanently redirected (HTTP 301) to their friendly form,
    // and the output is cached dynamically.
    routes.EnableFriendlyUrls(new FriendlyUrlSettings()
    {
      AutoRedirectMode = RedirectMode.Permanent,
      CachingMode = CachingMode.Dynamic
    });
  }
}

I have added configuration options to make redirects from the FriendlyUrl routes permanent (HTTP 301 status code) and to dynamically cache the contents of the output.  With this configuration in place, I can now route my users to cool-looking, SEO-friendly URLs: /Product/1, for example, instead of /Product.aspx?id=1.

I know what you’re thinking: “Jeff, in MVC the values submitted with the request flow through to action methods as input parameters; how do I access those values from a webform?”  The implementation of FriendlyUrls includes an extension method for the Request object called GetFriendlyUrlSegments that returns an IList<string>.  That’s nice if you really want to iterate over the entire URL and parse apart what was submitted, as in the sketch below, but I think there is something else you would prefer.
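
Here is a minimal sketch of that approach, assuming a Product.aspx page answering a /Product/1 style URL:

using Microsoft.AspNet.FriendlyUrls;  // brings the extension method into scope

protected void Page_Load(object sender, EventArgs e)
{
    // Returns the URL segments that follow the friendly page name,
    // e.g. { "1" } for a request to /Product/1
    IList<string> segments = Request.GetFriendlyUrlSegments();
    if (segments.Count > 0)
    {
        int id = int.Parse(segments[0]);
        // ... look up the product by id ...
    }
}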

Enter the ValueProviders

ValueProviders are parameter-level attributes that can be used to decorate methods in your webforms.  Using a FriendlyUrlSegments attribute, I can configure a public method in my webform to provide content based on the values submitted on the URL.  Consider this simple webform:
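
The markup looks something like this sketch (the control ID and the namespace on ItemType are my own illustration):

<asp:FormView ID="productDetail" runat="server"
    ItemType="MySite.Models.Product" SelectMethod="GetProduct">
  <ItemTemplate>
    <h1><%# Item.Name %></h1>
    <p><%# Item.Description %></p>
    <span><%# Item.Price.ToString("C") %></span>
  </ItemTemplate>
</asp:FormView>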

I can use a new feature in ASP.Net 4.5 and bind my Product business object directly to the FormView control.  All I need to do is specify the ItemType and SelectMethod properties to bind data for read operations.  ItemType is the qualified name of the class that is being bound.  SelectMethod is the public method in the code-behind that will return the business object (Product in this sample) to be presented.  Note that I am using the Item keyword to bind to the Product.  This creates a one-way binding, similar to how we use the Eval keyword.  There is also a BindItem keyword available that performs the familiar two-way binding that the Bind keyword gives us.

Let’s look at the GetProduct method:

public static readonly List<Product> ProductList = new List<Product>()
{
  new Product
  {
    Id=1,
    Name="Chess",
    Description="The classic game - you know... Chess!",
    Price=9.99M
  }
};

public Product GetProduct([FriendlyUrlSegments]int? id)
{
  if (id == null)
    return null;

  return ProductList.FirstOrDefault(p => p.Id == id.Value);
}

Now we see how the FriendlyUrlSegments value provider is put to use.  When the webform is rendered and my FormView is ready to bind to data, the webform passes the value from the FriendlyUrl into the input parameter automatically.  I don’t need to fuss with event timings, postbacks, or viewstate.  In this case, I end up with a simple webpage that tells me about the Chess product.

Summary

In this article, we introduced the concept of FriendlyUrls in ASP.Net webforms.  I showed you how to retrieve data from the formatted URL string so that it can be consumed.  We also took a brief look at ValueProviders in ASP.Net 4.5 and used the FriendlyUrlSegments attribute with some declarative data-binding on a standard FormView to present some data.

Next time, we’ll dig further into ValueProviders and ModelBinding in ASP.Net 4.5.

WebAPI follow-up… what about protocol buffers?

As a follow-up to my previous article about WebAPI, I received an interesting question from a friend on LinkedIn:  Is there a way to emit data in Protobuf format?  

This one got me thinking…  ProtoBuf is short for Protocol Buffers, a data serialization format invented by Google to allow for maximum compression of structured data that can be interpreted by any programming language.  Google says: “think XML, but smaller, simpler, and FASTER”, and I think they may be right.  According to Google’s own benchmarks, the Protocol Buffer format is 3-10x smaller and 20-100x faster on the wire.

Fortunately, there is a pair of NuGet packages available to help our WebAPI application handle the protocol buffer format.  From within my WebAPI application, I can use the NuGet console to install these two packages with the following command:

Install-Package WebApiContrib.Formatting.ProtoBuf

Once this command completes, the protocol buffers library protobuf-net should be installed, as well as the appropriate media formatter in the WebApiContrib.Formatting.ProtoBuf library.  The next step is to make the application aware of the new formatter.  This is accomplished with the following code in Application_Start in the global.asax.cs file:

protected void Application_Start(object sender, EventArgs e)
{
    // Register the protocol buffer media formatter alongside the defaults.
    GlobalConfiguration.Configuration.Formatters.Add(
        new WebApiContrib.Formatting.ProtoBufFormatter());

    RegisterRoutes(RouteTable.Routes);
}

Finally, we need to configure the data we are going to transport so that the Protocol Buffers library knows how to serialize it.  This is accomplished with a series of attributes on our Team class:

[ProtoContract]
public class Team
{
    [ProtoMember(1)]
    public string City { get; set; }

    [ProtoMember(2)]
    public string Name { get; set; }
}

The ProtoContract attribute tells the serializer that this class can be formatted with protocol buffers.  The two ProtoMember attributes assign each property a position in the resulting stream of data.  The numeric position identifiers must be unique positive integers, and by convention they start at one.
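
For context, here is a sketch of the API controller that could sit behind this sample; the controller name and seed data are my assumption, matched to the output shown below:

using System.Collections.Generic;
using System.Web.Http;

public class MyController : ApiController
{
    // GET api/my - returns the teams in whatever format
    // the client's Accept header negotiates.
    public IEnumerable<Team> Get()
    {
        return new List<Team>
        {
            new Team { City = "Philadelphia", Name = "Phillies" },
            new Team { City = "Boston", Name = "Red Sox" },
            new Team { City = "Cleveland", Name = "Browns" },
            new Team { City = "Houston", Name = "Astros" },
            new Team { City = "San Diego", Name = "Chargers" }
        };
    }
}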

With this minor bit of configuration, we can start our application and open Fiddler to easily test our new protocol-buffer-aware API.  If I point Fiddler at the standard location http://localhost:4829/api/my I’ll get normal JSON output:

[
{"City":"Philadelphia","Name":"Phillies"},
{"City":"Boston","Name":"Red Sox"},
{"City":"Cleveland","Name":"Browns"},
{"City":"Houston","Name":"Astros"},
{"City":"San Diego","Name":"Chargers"}
]

but if I go into the Composer tab of Fiddler and submit the request with an Accept header of application/x-protobuf:
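
The raw request looks something like this sketch (the port number is carried over from the URL above):

GET http://localhost:4829/api/my HTTP/1.1
Accept: application/x-protobuf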

The results come back in protocol buffer format: binary data that is not entirely readable as text.
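
To consume that response from code rather than Fiddler, a .NET client might look like this sketch; it assumes the protobuf-net package is installed on the client and that the Team class above is shared with it:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;

class Program
{
    static void Main()
    {
        var client = new HttpClient();
        // Ask the API for protocol buffer output instead of JSON.
        client.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/x-protobuf"));

        var stream = client.GetStreamAsync("http://localhost:4829/api/my").Result;

        // protobuf-net turns the binary stream back into Team objects.
        var teams = ProtoBuf.Serializer.Deserialize<List<Team>>(stream);
        foreach (var team in teams)
        {
            Console.WriteLine("{0} {1}", team.City, team.Name);
        }
    }
}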

We didn’t change any of our business logic.  We didn’t change any of our data access code.  By simply adding an additional media formatter, WebAPI was able to format the result set as requested.  What else could you format with WebAPI?  What other request formats could you interact with?  Let me know what you think in the comments below.