Editor’s Note: Today’s post, written by Linx e-Commerce Program Manager Fernando Chaves, describes how the company uses Windows Azure to scale out its LinxWeb Point-of-Sale system for its customers.
Linx is a 26-year-old ISV and a leading provider of ERP technology for the retail market in Latin America. We have more than 7,500 customers in Brazil, Latin America, and Europe, with more than 60,000 installed Point of Sale (POS) systems. Our company has more than 1,800 employees at our headquarters and branches, plus a network of partners spread throughout Brazil and abroad.
LinxWeb is a white-label B2C e-Commerce solution that our customers can use as a new POS system in their sales environment. It integrates with customers' on-premises ERP environments and can be managed just like a traditional POS system, while allowing specific customizations such as promotions.
Setting the Stage: Before Windows Azure
Before migrating to Windows Azure, LinxWeb ran on virtual machines (VMs) at a traditional hosting provider. Although this kind of deployment could, in theory, scale out, doing so was neither easy nor fast, and we often had to scale up instead, adding more memory, computing power, or network bandwidth to the VM.
LinxWeb was originally single-tenant: every customer had their own deployment and environment. Customization was done directly on the customer's web content files, which could lead to security and quality-control issues and generate excessive support requests caused by customization errors.
Before the Windows Azure migration, the website was responsible for every processing task: generating product image thumbnails, sending e-mails, and communicating with third-party systems. Every task ran synchronously, hurting the site's performance and availability for end customers.
The Migration to Windows Azure
When we decided to migrate LinxWeb to Windows Azure, some refactoring was needed to make it compatible with the stateless nature of Windows Azure web roles and the load balancer.
Since each web request could be sent to any web server instance, we needed to externalize session data. We chose Windows Azure SQL Database for our session storage.
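Because any instance can serve the next request, session state has to live in shared storage rather than in a single server's memory. The sketch below illustrates that pattern in Python; it is not LinxWeb's actual code, and the in-process dict merely stands in for the shared Windows Azure SQL Database table.

```python
import json
import uuid

# Stand-in for a shared database table. In the real system this would be
# a table in Windows Azure SQL Database reachable by every web role instance.
_session_table = {}

def save_session(session_id, data):
    """Serialize session state and write it to shared storage so that
    ANY web role instance can read it when handling the next request."""
    _session_table[session_id] = json.dumps(data)

def load_session(session_id):
    """Load session state from shared storage; unknown ids start empty."""
    raw = _session_table.get(session_id)
    return json.loads(raw) if raw is not None else {}

# Simulate two different server instances handling requests for one user.
sid = str(uuid.uuid4())
save_session(sid, {"cart": ["sku-123"]})   # request handled by instance A
state = load_session(sid)                  # next request lands on instance B
state["cart"].append("sku-456")
save_session(sid, state)
```

The key point is that nothing about the user's session is kept in process memory, so the load balancer is free to route each request anywhere.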
We also had to remove all file writes to the local disk, since local disk storage isn't shared between server instances. In addition, the local disk is not durable, unlike Blob storage or SQL Database, which replicate data across disks; local disks are designed for speed and temporary use, not permanent storage.
Media content, initially saved in SQL Server (in BLOB columns), is now stored in Windows Azure Blob storage, improving scalability for the website, since blob content can be cached at Windows Azure Content Delivery Network (CDN) edge locations. Also, by storing only a blob reference in SQL Database, rather than the entire media object, we keep our SQL Database much smaller, helping us stay under the size limits on individual SQL databases (up to 150 GB at the time).
Since blobs (and the CDN) are referenced by URL, browser requests for media now go directly to the CDN, bypassing our web role instances (and taking load off IIS and the database). This change yielded an average 75% reduction in database size and also cut storage costs, since blob storage is much cheaper than SQL Database. We also saw response times improve on our web role instances, since considerable load was taken off those servers.
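The blob-reference pattern can be sketched as follows. This is an illustrative example, not LinxWeb code: the dicts stand in for Blob storage and the product table, and the account URL and `save_product_image` helper are hypothetical names.

```python
# Stand-ins for Windows Azure Blob storage and a SQL Database table.
blob_store = {}
product_table = {}

# Hypothetical storage-account URL, for illustration only.
BLOB_BASE_URL = "https://example.blob.core.windows.net/media/"

def save_product_image(product_id, image_bytes):
    """Upload the media object to blob storage and keep ONLY its URL in
    the database row, instead of the bytes in a BLOB column."""
    blob_name = "products/{}.jpg".format(product_id)
    blob_store[blob_name] = image_bytes
    product_table[product_id] = {"image_url": BLOB_BASE_URL + blob_name}
    return product_table[product_id]["image_url"]

url = save_product_image(42, b"\xff\xd8...")  # truncated JPEG bytes
# The rendered page emits <img src="..."> with this URL, so the browser
# fetches the image from blob storage / the CDN, never touching the web
# role or the database.
```

The database row shrinks from megabytes of image data to a short string, which is where the large reduction in database size comes from.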
To make better use of compute resources, the Windows Azure version was designed with multi-tenancy in mind: multiple customers share compute resources, reducing hosting costs. Understanding that some customers may want an isolated environment, we also have a premium offer in which a customer receives a dedicated deployment. In this new version, customers no longer edit ASP.NET pages directly to change the site layout and look and feel. Instead, they change templates stored in Windows Azure Blob storage, and the ASP.NET pages process those templates to render updated HTML for the end user.
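The template-per-tenant idea can be sketched like this. Again, this is a simplified Python illustration rather than the real ASP.NET implementation: the dict stands in for templates stored in Blob storage, the tenant names are invented, and Python's `string.Template` stands in for whatever template syntax the real pages process.

```python
import string

# Stand-in for per-tenant templates kept in Windows Azure Blob storage.
# Customers edit these templates, never the shared page code.
template_store = {
    "tenant-a/home.tpl": "<h1>$store_name</h1><p>Welcome, $user!</p>",
    "tenant-b/home.tpl": "<h1>$store_name</h1>",
}

def render_page(tenant, page, context):
    """Load the tenant's template and fill in placeholders. The shared
    rendering code is identical for every tenant; only templates differ."""
    tpl = string.Template(template_store["{}/{}".format(tenant, page)])
    return tpl.safe_substitute(context)

html = render_page("tenant-a", "home.tpl",
                   {"store_name": "Contoso", "user": "Ana"})
```

Because customers can only change data (templates) and not code, a customization mistake breaks one tenant's layout at worst, instead of introducing the security and quality problems the old model allowed.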
Worker roles were used to handle background tasks such as generating picture thumbnails and sending e-mails. Those tasks are queue-driven, using Windows Azure Queues. The worker roles are also responsible for running scheduled tasks, mainly for communication with third-party systems. To manage scheduling, we used the Quartz.Net framework, which can run synchronized across multiple worker role instances. This is a very important point: if a scheduler is set up to run in a worker role, that scheduler runs in all instances. Quartz.Net ensures that only one scheduler instance runs at any given time.
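The queue-driven part of this design can be sketched as below. This is an illustrative Python example, not production code: the in-process `queue.Queue` stands in for a Windows Azure Queue, and the job names are invented.

```python
import queue

# Stand-in for a Windows Azure Queue. The web role enqueues work items
# during a request; worker role instances dequeue and process them later.
task_queue = queue.Queue()

def enqueue_thumbnail_job(image_name):
    """Called by the web role: record the work and return immediately,
    keeping slow processing off the request path."""
    task_queue.put({"type": "thumbnail", "image": image_name})

def worker_loop(processed):
    """Called by a worker role instance: drain and process queued jobs.
    In the real system this loop polls the queue continuously."""
    while not task_queue.empty():
        msg = task_queue.get()
        if msg["type"] == "thumbnail":
            processed.append("thumb-" + msg["image"])  # pretend to resize
        task_queue.task_done()

# Web role enqueues during page requests; worker role processes asynchronously.
enqueue_thumbnail_job("product-1.jpg")
enqueue_thumbnail_job("product-2.jpg")
done = []
worker_loop(done)
```

The web request finishes as soon as the message is enqueued, which is exactly what removed the synchronous thumbnail and e-mail work that used to hurt site performance.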
Some customers also want to host a company website or a blog alongside their e-commerce site. To meet this need, we use WordPress as our blog engine. WordPress is PHP-based, and by default the PHP runtime libraries are not installed on Windows Azure web or worker roles. Since our WordPress blogs run on Windows Azure web roles, we needed to install the required PHP components as well as WordPress itself. We did this with startup tasks and the Web Platform Installer command line, which set up the PHP runtime on IIS. A Windows Azure SQL Database is used as persistent storage, along with Windows Azure Blob storage, so we also installed the Windows Azure Storage plugin for WordPress, which uploads user files directly to blob storage.
For us, the main benefit of migrating our solution to Windows Azure is how easily and quickly we can scale out the application. This lets us focus on our customers' business needs and support marketing campaigns that drive a large number of requests from our end users.
For our customers, a big benefit is that they no longer need to worry about infrastructure and operating system management.
As pointed out, we had a few technical challenges to solve, none of them insurmountable:
- Moving from single- to multi-tenancy
- Moving local storage and SQL storage to blob storage and CDN
- Scheduling tasks with Quartz.Net across multiple role instances
- Installing PHP runtime and WordPress
- Refactoring web request handling to be stateless and scalable across multiple instances
We were able to handle all of these challenges and now have a very efficient application running in Windows Azure!