The other day was an especially fun day at IQMS. The president of the company, Randy Flamm, returned from a two-week vacation in the Mediterranean rejuvenated, smiling and in a great mood … and so was I! My Automation Department development team had made tremendous progress during Randy's vacation, and we were eager to present it. The morning of our meetings, I arrived at 8 a.m. to find no urgent emails waiting in my inbox, except for one from our lead developer. I quickly noticed the CC field … it contained the president’s email address!
“Jason – Please see what you can put together quickly to do a performance analysis on our Web API in JSON format versus a desktop/native connection to the database.”
After my heart started beating again, I said to myself with a sigh of relief and a grin on my face, “Oh, my!” As a purist (many might say anal, overachiever, perfectionist … yada, yada), I love a challenge and love it even more when I come up with a clever solution.
After about an hour constructing a client application that could communicate via web services with the IQMS EnterpriseIQ database, I had a simple test application. It could pass any SQL statement to the database, return the query results and report the approximate time the full round trip took: send the SQL statement over the web, have the web server receive the request and pass it to the database, have the database parse and execute the SQL, return the result data to the web server, serialize that data into JSON, send it back over the web to the client application, then have the client parse the results and display the data, along with the start-to-end time, on the desktop. Whew! All in an hour’s work, right?! It reminded me of a blog I posted a few months back on measuring throughput.
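The round trip described above can be sketched roughly like this (the real client was not Python, and the endpoint, handler and payload shapes here are invented stand-ins): a throwaway in-process HTTP server plays the web-service tier, and a helper times the full send-query-parse cycle end to end.

```python
import json
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Hypothetical stand-in for the web-service tier: it accepts any
# "SQL" posted to it and echoes back a canned result set as JSON.
class FakeApiHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        sql = self.rfile.read(length).decode()
        body = json.dumps({"sql": sql,
                           "rows": [[1, "widget"], [2, "gadget"]]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the test run quiet
        pass

def timed_query(url: str, sql: str):
    """POST the SQL to the web tier, parse the JSON reply, and
    return (rows, elapsed_seconds) measured start to end."""
    start = time.perf_counter()
    req = urllib.request.Request(url, data=sql.encode(),
                                 headers={"Content-Type": "text/plain"})
    with urllib.request.urlopen(req) as resp:
        payload = json.loads(resp.read().decode())
    return payload["rows"], time.perf_counter() - start

if __name__ == "__main__":
    server = ThreadingHTTPServer(("127.0.0.1", 0), FakeApiHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_address[1]}/query"
    rows, secs = timed_query(url, "SELECT id, name FROM parts")
    print(f"{len(rows)} rows in {secs * 1000:.2f} ms")
    server.shutdown()
```

Note that the clock wraps the whole journey, serialization and parsing included, which is exactly what the desktop comparison needs: the user feels the total, not the database's share of it.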
With the application in place, we started running the tests. The results were not surprising given how many layers were involved (even though the application itself was as simple as you can get): it was slow compared to a native/desktop connection. My pride was not broken yet, though … after all, this was only the first build. Not too shabby at 5x slower than the native equivalent, but nothing to blog about either! Even after several attempts at streaming data through the multiple layers, I could only get it down to 4x slower.
The tests proved that cloud computing is slow for large amounts of data. In an ERP environment, where you’re dealing with more than simple websites that tell you where to eat, where the next Starbucks is and what movies are playing this weekend at the theatre, there’s a lot to consider. Don’t get me wrong, there are certain aspects of any system that can benefit from the cloud: a customer storefront, employee web portal, mobile phone access for online quotes and a few other remote access interfaces. However, do you really want your master customer lists, internal documentation, vendor lists, trade secrets, customer credit card information and payroll information sitting on a cloud computer hosted by an ISP for pennies per GB? I would guess your answer is not no, but NO!
I’m digressing, though, because this blog is about evaluating performance. How do we increase the speed, or perceived performance, of an application? After a couple more hours of playing with binary serialization, compression, caching and threading, the answer was clear: there was not much more I could do to improve the raw data transmission speed over the Internet. After all, you can’t shrink a transmission below the number of 1s and 0s the connection can carry per second.
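Compression was one of those levers, and a quick experiment shows why it helps, and why it isn't a cure-all: gzipping a JSON payload shrinks what crosses the wire, but the link's bits-per-second ceiling is untouched. The result set below is invented purely for illustration.

```python
import gzip
import json

# Hypothetical result set standing in for rows returned by the database.
rows = [{"id": i, "part": f"part-{i}", "qty": i * 3} for i in range(1000)]

raw = json.dumps(rows).encode()      # what a plain JSON response sends
packed = gzip.compress(raw)          # what a gzip-encoded response sends

print(f"raw JSON: {len(raw)} bytes, gzipped: {len(packed)} bytes "
      f"({len(packed) / len(raw):.0%} of original)")

# The payload survives the round trip intact.
assert json.loads(gzip.decompress(packed)) == rows
```

Repetitive tabular JSON compresses well, so the win can be large; the cost is CPU time spent compressing and decompressing at each end, which is part of why the layers only got me from 5x down to 4x.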