Wednesday 5 November 2014

Gigabit File Uploads Over HTTP - The Node.js Version

Please see the ASP.NET version of this article. It provides background information that might not be covered here.

Updated version that uses an NGINX reverse proxy and asynchronous code

Like it or not, JavaScript is everywhere. It can be found in front-end applications on the client, in a variety of frameworks and libraries, and in back-end applications in server environments.

The popularity of JavaScript has certainly increased in recent years, and that seems to be because the JavaScript ecosystem is helping to improve productivity and to reduce the time it takes to get an application out the door. So after my first post, in which I blogged about doing Gigabit file uploads using an ASP.NET Web API backend, I decided to see if I could achieve the same thing using Node.js. This means that I will be implementing the "UploadChunk" and "MergeAll" methods that I spoke about in my last post in Node.js, along the lines of the sketch below.
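
To give a feel for the shape of those two methods, here is a minimal sketch in Node.js. It assumes Express and a URL scheme that carries the file name and chunk number; the routing scheme, staging directory, and port are illustrative assumptions, not the actual code from the post.

```javascript
var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();
var uploadDir = 'uploads'; // assumed staging directory for incoming chunks
if (!fs.existsSync(uploadDir)) fs.mkdirSync(uploadDir);

// UploadChunk: stream one numbered chunk of the file to disk.
app.post('/UploadChunk/:fileName/:chunkNumber', function (req, res) {
  var chunkPath = path.join(uploadDir,
      req.params.fileName + '.' + req.params.chunkNumber);
  var out = fs.createWriteStream(chunkPath);
  req.pipe(out);
  out.on('finish', function () { res.sendStatus(200); });
});

// MergeAll: concatenate the numbered chunks back into the original file.
app.post('/MergeAll/:fileName/:totalChunks', function (req, res) {
  var fileName = req.params.fileName;
  var total = parseInt(req.params.totalChunks, 10);
  var merged = fs.createWriteStream(path.join(uploadDir, fileName));
  var i = 0;
  (function appendNext() {
    if (i >= total) {
      merged.end();
      return res.sendStatus(200);
    }
    var chunk = fs.createReadStream(path.join(uploadDir, fileName + '.' + i));
    chunk.pipe(merged, { end: false }); // keep the merged stream open between chunks
    chunk.on('end', function () { i++; appendNext(); });
  })();
});

app.listen(8080);
```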


Friday 17 October 2014

Gigabit File Uploads Over HTTP

Please see the Node.js version of this article.

Updated version that uses an NGINX reverse proxy and asynchronous code

Large file uploads, how do I love thee? Let me count the ways. These days, having to deal with large file uploads is rather commonplace in IT environments, and by large file uploads I mean files that are over 500 MB in size. Sure, we have replication technologies that can help us keep our data in sync, but there is still a need to move large amounts of data on a regular basis.

Most of the time, when I had to move anything over 500 MB in size, I would split the file into several smaller files using a file compression utility and then upload those smaller files via FTP or Secure Shell (SSH). Once all of the smaller files had been uploaded, I would use the file compression utility to recreate the original file. However, that approach required the setup of an FTP server or a Secure Shell server, plus the use of a third-party file compression utility.

So I asked "With the prevalence of  web browsers and web servers in the IT environment could I  accomplish the same thing using the HTTP protocol?"

A quick search on the Internet showed that web servers typically limit the maximum file size that can be uploaded to anywhere between 2 GB and 4 GB, and in addition most web browsers will only allow you to upload around 2 GB. I suppose the reason for that is that the web browser treats the Content-Length header as a signed 32-bit integer, and the maximum value of a signed 32-bit integer is 2,147,483,647.
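
That figure lines up with simple arithmetic: the largest value a signed 32-bit integer can hold is 2^31 - 1, which works out to just under 2 GiB.

```javascript
// Largest signed 32-bit integer: 2^31 - 1
console.log(Math.pow(2, 31) - 1);                      // 2147483647
// Expressed in GiB (1 GiB = 2^30 bytes): just shy of 2
console.log((Math.pow(2, 31) - 1) / Math.pow(2, 30));  // ~1.9999999991
```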

As I continued to search, I began to look at the HTML5 specification and the APIs that are a part of it, because the indication was that these new APIs would allow me to upload files greater than 2 GB in size over the HTTP protocol. I also came across code examples on the Internet that indicated what could be done, but not a complete example of how it could be done.
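
The key piece is the HTML5 File API, which lets the browser slice a File into Blobs and send each piece as its own request, so no single request ever approaches the 2 GB ceiling. A minimal client-side sketch, where the chunk size, endpoint path, and fire-and-forget error handling are illustrative assumptions rather than the article's actual code:

```javascript
// Slice a File into 1 MB pieces with Blob.slice and POST each one.
var CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk (illustrative choice)

function uploadFile(file) {
  var totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (var i = 0; i < totalChunks; i++) {
    var start = i * CHUNK_SIZE;
    var end = Math.min(start + CHUNK_SIZE, file.size);
    var xhr = new XMLHttpRequest();
    // Endpoint name mirrors the UploadChunk method discussed above.
    xhr.open('POST', '/UploadChunk/' + encodeURIComponent(file.name) + '/' + i, true);
    xhr.send(file.slice(start, end)); // slice() never loads the whole file into memory
  }
}

// Usage: wire it to a file input
// document.getElementById('fileInput').addEventListener('change', function (e) {
//   uploadFile(e.target.files[0]);
// });
```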


Tuesday 29 July 2014

Create and execute SSIS packages with PowerShell

The other day I was working on a project to migrate the backend of an application from another RDBMS engine to Microsoft SQL Server 2008 R2. I figured that this would be quite easy to do, but when I tried to migrate the database using the SQL Server Import and Export Wizard I started to get errors and the data migration failed. After trying several other methods to migrate the database, we finally decided that we would create a version of the database in SQL Server and then use SQL Server Integration Services (SSIS) to migrate the data from the other RDBMS engine to Microsoft SQL Server 2008 R2.

This sounded easy enough until I realized that we would have to create over 900 table mappings between the source and destination databases using the SQL Server Business Intelligence Development Studio (BIDS). I was not looking forward to doing that. So what should I do?

I spent the rest of that afternoon doing research, and I kept coming across blog posts indicating that I could create SSIS packages dynamically from a template. This sounded like a good idea, and I was intrigued by the concept of automating the creation of all of the SSIS packages I would need to migrate the data from an SSIS template. The only problem was that I would then have over 900 SSIS packages that I would need to run simultaneously, and there was no easy way to do that in BIDS. Bummer!