Friday 8 February 2019

CelerFT - Uploading gigabit sized file in chunks with resume

From time to time people have reached out to me about the Gigabit File upload articles, and one of the things I have been asked is how to resume a file upload. In response to that question I have been working on an update to the Gigabit File upload project, and I have added it to my GitHub repository.

This version is a complete rewrite of the Gigabit File upload project, and it aims to achieve the following:

  1. Rewrite the JavaScript code as an Immediately-Invoked Function Expression (IIFE).
  2. Use nested web workers to handle the file uploads. If the browser does not support nested web workers, then the subworkers.js polyfill can be used to meet this requirement.
  3. Resume a file upload. Resuming an upload will not overwrite the file if it already exists (see the sketch after this list).
  4. Provide complete samples of the back-end server code in ASP.NET and Node.js.
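
As a rough illustration of how the resume behaviour might look on the client side, here is a minimal IIFE sketch. The /UploadedChunks and /UploadChunk endpoints, the chunk size, and the response format are assumptions made for this example rather than the project's actual code.

var celerft = (function () {
    'use strict';

    var CHUNK_SIZE = 50 * 1024 * 1024; // 50 MB per chunk (an arbitrary choice)

    // Ask the server which chunk numbers it already has so the upload can
    // resume without re-sending or overwriting them.
    function getUploadedChunks(fileName) {
        return fetch('/UploadedChunks?filename=' + encodeURIComponent(fileName))
            .then(function (response) { return response.json(); });
    }

    function uploadFile(file) {
        var totalChunks = Math.ceil(file.size / CHUNK_SIZE);

        return getUploadedChunks(file.name).then(function (uploaded) {
            var pending = [];
            for (var i = 0; i < totalChunks; i++) {
                if (uploaded.indexOf(i) !== -1) { continue; } // already on the server
                var chunk = file.slice(i * CHUNK_SIZE, Math.min((i + 1) * CHUNK_SIZE, file.size));
                var form = new FormData();
                form.append('chunkNumber', i);
                form.append('totalChunks', totalChunks);
                form.append('chunk', chunk, file.name);
                pending.push(fetch('/UploadChunk', { method: 'POST', body: form }));
            }
            return Promise.all(pending);
        });
    }

    // Expose only the public API from the IIFE.
    return { uploadFile: uploadFile };
}());

In the real project the upload work is handed off to web workers rather than run on the main thread, but the resume check follows the same idea: find out what the server already has, then send only the missing pieces.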

Friday 15 April 2016

Gigabit File uploads Over HTTP - The Node.js with NGINX version

Please see the ASP.NET version of this article. It provides background information that might not be covered here.

Please see the original Node.js version of this article.

One of the things that we wanted to do after blogging about Gigabit File uploads with Node.js was to see how we could improve the performance of the application. In the previous version the code was mostly synchronous, and as a result we had high CPU usage, did quite a lot of I/O, and used up a fair amount of memory. All in all, what was created had more to do with demonstrating the concept of Gigabit File uploads over HTTP than with performance.

Now that the concept has been established, it is time to see how the application's performance can be improved.
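
To give a sense of the kind of change involved, here is a minimal sketch of a streaming, asynchronous approach in Node.js with Express. The /UploadChunk route, the uploads directory, and the port are assumptions made for illustration, not the application's actual code.

var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();

// Synchronous version (roughly what the original code did): blocks the
// event loop for the duration of every disk write.
// fs.writeFileSync(chunkPath, chunkBuffer);

// Asynchronous version: the request body is streamed straight to disk and
// the event loop stays free to accept other requests while the write runs.
app.post('/UploadChunk', function (req, res) {
    // The uploads directory is assumed to exist.
    var chunkPath = path.join(__dirname, 'uploads', 'chunk-' + Date.now());
    var output = fs.createWriteStream(chunkPath);

    req.pipe(output);
    output.on('finish', function () { res.status(200).send('Chunk saved'); });
    output.on('error', function (err) { res.status(500).send(err.message); });
});

app.listen(8080);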

Friday 2 October 2015

Getting Started with foreScript

I have been using PowerShell for some time now and I think that it is a great addition to the toolset of an IT professional. However, the more I used PowerShell, the more I found myself having to write the same pieces of code over and over again in my scripts. Those pieces of code had to do with:

  • Getting a list of computers to run the script against - You know the drill: that meant using a cmdlet to get a list of computers from a CSV file, a text file, or the command line.
  • Getting the information for user authentication - If I wanted to run the script as another user I would end up either prompting for the required credentials, elevating privileges at run time, or even running PowerShell as an administrator.
  • Formatting the output of the PowerShell scripts for display purposes - This speaks for itself. I have written quite a few PowerShell scripts that have code that is only used for converting the output into HTML.

I could have written a module that would handle all of those issues for me and used that module in every PowerShell script that I wrote. However, I wanted to do much more than that. I wanted to create a framework that would support the execution of PowerShell scripts.

Saturday 19 September 2015

BSOD REGISTRY ERROR 51

It wasn't the way that I expected to spend my day, but things don't always turn out the way that you expect. I had just received a call that one of our production Terminal Servers was rebooting, and to make matters worse it was the server that handled the majority of our workload. I logged onto the server, and as I went through the System Event log I saw the message:

"The computer has rebooted from a bugcheck.  The bugcheck was: 0x00000051 (0x00000004, 0x00000001, 0xe7079d70, 0x00000238). A dump was saved in: C:\WINDOWS\MEMORY.DMP."

Friday 15 May 2015

Automation Nightmare - Updating Excel Spreadsheet from Powershell

The assignment was very simple. All I had to do was log into SQL Server Management Studio (SSMS), run a query, copy and paste the results of the query into an Excel Spreadsheet Template, and then save that template using an agreed naming convention. The only problem was that I had to run the query with different parameters, which meant doing it nearly one hundred times.

So I dutifully began my task and after a few hours I had only extracted the data for about three of the scenarios. The way that I was going about doing the extract would take forever and a day and I did not have that much time.

They say that "if you have a hammer, everything looks like a nail", and so I decided to use my PowerShell hammer to nail my data extraction problem, so to speak. As far as I was concerned, PowerShell was the ideal tool for the job. I could use PowerShell to run the queries against the SQL Server database and return the data in a dataset. Once I created the dataset I would loop through the data, update each worksheet in the Excel Spreadsheet Template, and then save the spreadsheet using the agreed naming convention. Life could not be easier.

Wednesday 5 November 2014

Gigabit File uploads Over HTTP - The Node.js version

Please see the ASP.NET version of this article. It provides background information that might not be covered here.

Updated version that uses NGINX reverse proxy and asynchronous code

Like it or not, JavaScript is everywhere. It can be found in front-end applications on the client, in a variety of frameworks and libraries, and in back-end applications in server environments.

The popularity of JavaScript has certainly increased in recent years, and that seems to be because the JavaScript ecosystem is helping to improve productivity and to reduce the time it takes to get an application out the door. So after my first post, in which I blogged about doing Gigabit File uploads using an ASP.NET Web API back end, I decided to see if I could achieve the same thing using Node.js. This means that I will be implementing the "UploadChunk" and "MergeAll" methods that I spoke about in my last post in Node.js.
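
As a rough sketch of what the "MergeAll" step might look like in Node.js with Express (the route, the directory layout, and the chunk-file naming are assumptions for illustration, not the article's actual implementation):

var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();
var UPLOAD_DIR = path.join(__dirname, 'uploads');

// "MergeAll": once every chunk has been uploaded, concatenate the numbered
// chunk files back into the original file in order.
app.post('/MergeAll', function (req, res) {
    var fileName = req.query.filename;
    var chunkDir = path.join(UPLOAD_DIR, fileName);

    fs.readdir(chunkDir, function (err, files) {
        if (err) { return res.status(500).send(err.message); }

        // Chunk files are assumed to be named 000000, 000001, ... so a
        // lexicographic sort restores the original order.
        files.sort();

        var output = fs.createWriteStream(path.join(UPLOAD_DIR, fileName + '.merged'));

        (function appendNext(index) {
            if (index >= files.length) {
                output.end();
                return res.status(200).send('Merge complete');
            }
            var input = fs.createReadStream(path.join(chunkDir, files[index]));
            input.pipe(output, { end: false });
            input.on('end', function () { appendNext(index + 1); });
        }(0));
    });
});

app.listen(8080);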


Friday 17 October 2014

Gigabit File uploads Over HTTP

Please see the Node.js version of this article.

Updated version that uses NGINX reverse proxy and asynchronous code

Large file uploads, how do I love thee? Let me count the ways. These days having to deal with large file uploads is rather commonplace in IT environments, and by large file uploads I mean files that are over 500 MB in size. Sure, we have replication technologies that can help us keep our data in sync, but there is still the need to move large amounts of data on a regular basis.

Most of the time, when I had to move anything over 500 MB in size, I would split the file into several smaller files using a file compression utility and then upload those smaller files via FTP or Secure Shell (SSH). Once all of the smaller files had been uploaded, I would use the file compression utility to recreate the original file. However, that required the setup of an FTP server or a Secure Shell server and the use of a third-party file compression utility.

So I asked, "With the prevalence of web browsers and web servers in the IT environment, could I accomplish the same thing using the HTTP protocol?"

A quick search on the Internet showed that web servers typically limit the maximum file size that can be uploaded to somewhere between 2 GB and 4 GB, and in addition most web browsers will only allow you to upload around 2 GB. I suppose the reason for that is that the Content-Length header is treated as a signed 32-bit integer in the web browser, and the maximum value of a signed 32-bit integer is 2,147,483,647.
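
A quick back-of-the-envelope check makes the limit concrete; the 100 MB chunk size below is an arbitrary choice used only for illustration.

// The signed 32-bit ceiling that the Content-Length value runs into.
var maxInt32 = Math.pow(2, 31) - 1;           // 2147483647 bytes, roughly 2 GB

var fileSize = 4 * 1024 * 1024 * 1024;        // a 4 GB file
console.log(fileSize > maxInt32);             // true - too large for a single request

// Splitting the file into chunks keeps each request's Content-Length well under the limit.
var chunkSize = 100 * 1024 * 1024;            // 100 MB per chunk
console.log(Math.ceil(fileSize / chunkSize)); // 41 requests instead of one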

As I continued to search, I began to look at the HTML5 specification and the APIs that are a part of it, because the indication was that these new APIs would allow me to upload files larger than 2 GB over the HTTP protocol. I also came across code examples on the Internet that indicated what could be done, but not a complete example of how it could be done.
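
One of those APIs is the File API. Here is a minimal sketch of the part that matters for large uploads: File.slice() carves out a Blob for each piece without loading the whole file into memory. The fileInput element id and the chunk size are assumptions for this example.

document.getElementById('fileInput').addEventListener('change', function (event) {
    var file = event.target.files[0];          // a File object from <input type="file">
    var chunkSize = 50 * 1024 * 1024;          // 50 MB pieces (arbitrary)
    var chunks = Math.ceil(file.size / chunkSize);

    for (var i = 0; i < chunks; i++) {
        var start = i * chunkSize;
        var end = Math.min(start + chunkSize, file.size);
        var blob = file.slice(start, end);     // a lightweight reference, not a copy
        console.log('chunk ' + i + ': ' + blob.size + ' bytes');
        // Each blob can then be posted with XMLHttpRequest or fetch via FormData.
    }
});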