On the tail end of our WPF client project, we started getting a ‘Task is cancelled’ exception from a method that posts JSON data to a REST API. This exception is usually just a pointer to the actual problem, and I was confident it wasn’t a code issue because the last modification to the method was made four weeks ago.

It turns out that since we started testing rigorously, we have been sending large amounts of JSON to the API. Our knee-jerk reaction was to raise the amount of JSON we can post to the maximum value (of course):

protected virtual string Serialize(object obj) {
    var javaScriptSerializer = new JavaScriptSerializer();
    javaScriptSerializer.MaxJsonLength = Int32.MaxValue;
    return javaScriptSerializer.Serialize(obj);
}

We mistakenly thought this solved the problem. A few days later we started getting the same error again, so we decided to take a long-term approach to the issue. We cannot hope that there will be less data. We can, however, chop the data and send it in batches:

public virtual async Task PostBatch<T>(string path, IEnumerable<T> collection, SyncServiceContainer<T> container, int take) {
    // Ceiling division: avoids an extra, empty batch when the count
    // is an exact multiple of take.
    int max      = (collection.Count() + take - 1) / take;
    var contents = new List<HttpContent>();

    for (int i = 0; i < max; i++) {
        var slicedCollection = collection.Skip(take * i).Take(take).ToList();
        container.Data       = slicedCollection;
        contents.Add(SerializeContent(container));
    }

    foreach (var content in contents) {
        await Post(path, content);
    }
}

A few things can be said about the PostBatch method. First, the collection parameter is the data that needs to be chopped. We used IEnumerable<T> so it can accept any type. Second, the container parameter is just an object that wraps the collection. That class has a property called Data which holds the chopped collection. The API that receives the data should accept a parameter with the same shape as the SyncServiceContainer object.
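The batch count is just ceiling division over the collection size. Here is a standalone sketch of that arithmetic (BatchCount is a hypothetical helper, not part of the service):

```csharp
using System;

class BatchMath {
    // Ceiling division: how many batches of size `take` cover `count` items.
    public static int BatchCount(int count, int take) {
        return (count + take - 1) / take;
    }

    static void Main() {
        Console.WriteLine(BatchCount(10, 3)); // 4 batches: 3 + 3 + 3 + 1
        Console.WriteLine(BatchCount(9, 3));  // 3 batches: no empty tail batch
    }
}
```

The naive `(count / take) + 1` produces an extra, empty batch whenever the count is an exact multiple of take, which is why ceiling division is the safer form.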

This resolved our problem completely.

 

Rappler:

TV executives are redefining their business models as they navigate the television industry’s shift to a multimedia sphere.

“It’s not the future. Digitalization is today. Physical capital barriers have crumbled as ‘open content’ on the Web undermines the traditional scheme of delivering news,” ABS-CBN chief digital officer Donald Lim said on the sidelines of the Philippine Marketing Association event in June.

I disagree. This is not just a medium problem. If it were, they could make their content available online and the problem would go away. I’d argue that some of their content is already online, yet they are still in the same predicament. The problem is traditional content. It is really baffling that, up to now, local TV networks are still using the same old, tired formula: pop star-ridden variety shows and soaps. The creativity has stagnated.

Television is not an endangered medium (I believe it is undergoing the same transformation the smartphone did, but I digress): people are still crazy about Game of Thrones and The Walking Dead. These are TV-first shows, and they are great content. Shows like Last Week Tonight are very successful because they are breaking the barriers of traditional content. They experiment. They create thoughtful, entertaining content.

I think local TV networks are out of ideas. They are desperately feeding the fickle-minded masses and they know it is not sustainable. If redefining business models and downsizing the workforce are the only things up their sleeve, then they need to brace themselves. Shit is about to hit the fan.

We’ve been using Slack for a while now and we find it really useful for sending receipt notifications: automated, pre-formatted messages from an application. For example, you might want Dropbox to notify you after a file is copied to a certain directory.

In one of our projects, we perform the back-office task of importing files from an external source. It’s a lengthy and repetitive chore, so I decided to create a nifty tool to automate it. Everything works great except that we have to constantly remote into the server to check whether the import is finished. I figured this would be a perfect use case for Slack’s APIs, which are open and well documented. All I needed were webhooks, which are basically just HTTP calls. There are several third-party .NET libraries that encapsulate Slack’s API calls, which can be found here, but I highly recommend that you go plain vanilla and use plain HTTP calls. The steps are fairly trivial: set up your incoming webhook integration here (login needed), copy your webhook URL, then send a POST request with a JSON payload to that URL.
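To make the “plain HTTP” point concrete, the whole exchange boils down to one POST. A minimal sketch, with a placeholder webhook URL (Slack generates the real one on your integration page) and a hand-rolled payload for illustration only:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SlackWebhookSketch {
    // Placeholder: substitute the webhook URL Slack generates for you.
    const string WebhookUrl = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL";

    // Hand-rolled JSON; "text" is the only required field.
    // No escaping is done here; this is illustration, not production code.
    public static string BuildPayload(string message) {
        return "{\"text\":\"" + message + "\"}";
    }

    public static async Task PostAsync(string message) {
        using (var client = new HttpClient()) {
            var content = new StringContent(BuildPayload(message), Encoding.UTF8, "application/json");
            await client.PostAsync(WebhookUrl, content);
        }
    }
}
```

Everything after this point in the post is just wrapping this one call in reusable classes.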

So here’s how I did it. First, following the DRY principle, I have a base class for all of the basic HTTP calls. Below is a snapshot of my base class. You might want to add more methods like Get or Put for the GET and PUT HTTP methods, respectively. Take notice of the `SerializeContent` method. That’s how we serialise local objects to JSON.

    public abstract class BaseSyncService : BaseService {
 
        private HttpClient _HttpClient;
        public HttpClient HttpClient {
            get {
                if (_HttpClient == null)
                    _HttpClient = new HttpClient {
                        BaseAddress = new Uri(BuildAPIURL())
                    };
 
                return _HttpClient;
            }
        }
 
        private string BuildAPIURL() {
            return ConfigSettings("APIEndPointURL");
        }
 
        protected virtual HttpContent SerializeContent(object obj) {
            return new StringContent(new JavaScriptSerializer().Serialize(obj), Encoding.UTF8, "application/json");
        }
 
        public virtual async Task<string> Post(string path, HttpContent content) {
            // Do not dispose the shared HttpClient per call; disposing it here
            // would break every subsequent request made through this instance.
            var result = await HttpClient.PostAsync(path, content);
            return await result.Content.ReadAsStringAsync();
        }
 
    }

So I can make a derived class called SlackSyncService, which looks something like this:

    public class SlackSyncService : BaseSyncService {
        public async Task<string> Notify(string message) {
            return await Post("", SerializeContent(new {
                text       = message,
                username   = "Import Bot",
                icon_emoji = ":thumbsup:"
            }));
        }
    }

Now we’re getting somewhere. Keep in mind that our Post method is awaitable, so the caller can asynchronously wait for Slack’s API response. Additionally, we can use .NET’s anonymous types to act as a container for our data, which serialises to JSON perfectly. There are many settings to customise your post. You can find everything here.
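As one example of those settings, the incoming-webhook payload can also carry a channel override alongside username and icon_emoji, and the anonymous type grows naturally. A sketch (the channel name below is made up):

```csharp
using System;
using System.Web.Script.Serialization; // needs a reference to System.Web.Extensions

class PayloadSketch {
    public static string Build(string message) {
        // Anonymous type -> JSON; "channel" overrides the webhook's default channel.
        return new JavaScriptSerializer().Serialize(new {
            text       = message,
            username   = "Import Bot",
            icon_emoji = ":thumbsup:",
            channel    = "#imports" // hypothetical channel name
        });
    }

    static void Main() {
        Console.WriteLine(Build("Files successfully imported!"));
    }
}
```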

Now I can just call this service class somewhere in my ViewModel:

        private async Task NotifySlack() {
            if (ShouldNotifySlack) {
                try {
                    await new SlackSyncService().Notify(String.Format("Files successfully imported! Total import time is {0}", ElapsedTime));
                } catch (Exception exception) {
                    // EventHandler<string> so the handler can receive the message directly.
                    EventHandler<string> onExceptionFileImport = OnExceptionFileImport;
                    if (onExceptionFileImport != null)
                        onExceptionFileImport(this, exception.Message);
                }
            }
        }

(I know some of you are probably wondering why I seem to use the new keyword willy-nilly and that this is an anti-pattern, but I have my reasons and it is a good topic for a separate blog post.)

That’s it. Adding Slack integration to a .NET app is really easy and fun. It’s a great weekend project. Happy coding!

 

We recently started a desktop client app for one of our clients. Being a .NET shop, we naturally picked WPF over the alternatives for two main reasons: 1) WinForms is practically dead and 2) WPF can target a broader set of platforms than WinRT/Universal Apps. WPF is a great framework; the separation between UI and code-behind is akin to ASP.NET MVC’s Razor. The biggest pain point, however, is not being able to drill down into UI code the way web developers can inspect HTML, CSS and JavaScript inside a browser. It’s such a huge problem.

While browsing MSDN, I was blown away that such functionality now exists in Visual Studio 2015. The video discusses two of XAML’s UI debugging tools: Live Visual Tree and Live Property Explorer. If you’re a web developer, you can pretty much guess how these tools behave: they let you inspect your element hierarchy while the application is running. I’ve embedded the video below. Enjoy!

There’s nothing more gratifying to a developer than finishing a weekend project ahead of time. I started building a WPF desktop client around 1 PM yesterday. Roughly twelve hours later, I was able to put up a stable, working build. That includes a 30-minute walk, occasional play time with the kids and an hour-and-a-half massage. Granted, I reused some of my libraries, but still, that’s not too shabby.

I need some caffeine.

Hot off the press: Lazada.com.ph offers the Apple Watch locally, ahead of mostly everyone else (I checked a few online stores and none of them offer it yet; there are a few sellers on OLX, however). The catch: the price is ridiculous. The Stainless Steel (42mm, screen grab) costs P80,890, a near-100% markup from the original price. The Sport (42mm, screen grab) is even crazier. It costs P68,890, a whopping near-400% markup. It gets worse. If you read the fine print, there is no product warranty!

The previous listed price for both watches was, get this, P100,000, which makes the current prices appear discounted. So you are “saving” 31% and 19% on the Sport and the Steel, respectively.

Hilarious.

Speaking of ports, CNET has a good rundown about USB Type-C and its previous incarnations. This little nugget caught my attention:

Type-C USB also allows for bi-directional power, so apart from charging the peripheral device, when applicable, a peripheral device could also charge a host device. All this means you can do away with an array of proprietary power adapters and USB cables, and move to a single robust and tiny solution that works for all devices. Type-C USB will significantly cut down the amount of wires currently needed to make devices work.

Sweet. One power bank to charge them all?

TechCrunch:

Apple just announced its latest MacBook. It’s tiny… This MacBook only has a single USB-C and it does everything from charging, to sending video out and transporting data.

Predictably, a lot of people are criticising Apple for the move. This is the same reaction I had at its launch in January 2008, when Apple decided not to include a CD-ROM drive in the MacBook Air. Guess what? I am writing this blog post on a 13″ MacBook Air. I get why Apple is doing this. I also understand why most people are disappointed. If you are one of those people, the answer is simple: this product is not for you, yet. People have hundreds of options out there, including Apple’s own products: the MacBook Air and MacBook Pro.

The new MacBook is a forward-thinking product. Imagine being back in 2005 and using a machine where you can start working the second you open your laptop’s lid. No boot time, no waiting. When you need to connect to a device, you do not plug anything into your machine; it connects wirelessly. It does all the computing in the background, wirelessly and silently. It makes everything simple and easy. Familiar? That’s because it’s possible now, and the new MacBook further solidifies the concept.

I am not claiming that the new MacBook will be the only one to do this, but, compared to any device in the market right now, it has the clearest intention to do so. It might also be the best device to execute such a dream.

A machine that doesn’t get in your way, so it starts to disappear. That’s what this device is all about.