Increase your Linux server Internet speed with TCP BBR congestion control

Posted in Cloud Computing; last updated July 22, 2017.

I recently read that TCP BBR has significantly increased throughput and reduced latency for connections on Google’s internal backbone networks, and has raised google.com and YouTube web server throughput by 4 percent on average globally, and by more than 14 percent in some countries. The TCP BBR patch needs to be applied to the Linux kernel; the first public release of BBR appeared in September 2016, and the patch is available for anyone to download and install. Another option is Google Cloud Platform (GCP), which has this cutting-edge congestion control algorithm turned on by default.
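If you run a recent kernel (4.9 or later ships the bbr module), a minimal sketch for enabling it looks like this; pairing BBR with the fq queue discipline is the commonly recommended setup:

    # check which congestion control algorithms the kernel offers
    sysctl net.ipv4.tcp_available_congestion_control
    # enable the fq packet scheduler and BBR, then reload the settings
    echo 'net.core.default_qdisc=fq' | sudo tee -a /etc/sysctl.conf
    echo 'net.ipv4.tcp_congestion_control=bbr' | sudo tee -a /etc/sysctl.conf
    sudo sysctl -p
    # confirm BBR is now in use
    sysctl net.ipv4.tcp_congestion_control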

How to use parallel ssh (PSSH) for executing ssh in parallel on a number of Linux/Unix/BSD servers

Posted in Cloud Computing, Command Line Hacks, Howto; last updated April 21, 2017.

Recently I came across a nifty little tool called pssh that runs a single command on multiple Linux / UNIX / BSD servers. You can easily increase your productivity with this SSH tool.

More about pssh

pssh is a command line tool for executing ssh in parallel on a number of hosts. Its features include:

  1. Sending input to all of the processes
  2. Inputting a password to ssh
  3. Saving output to files
  4. IT/sysadmin task automation such as patching servers
  5. Timing out and more

Let us see how to install and use pssh on Linux and Unix-like systems.
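As a quick sketch (on Debian/Ubuntu the package is named pssh and may install the binary as parallel-ssh; the host names below are placeholders):

    # install pssh (Debian/Ubuntu; the command may be installed as parallel-ssh)
    sudo apt-get install pssh
    # list your servers, one per line
    cat > ~/pssh-hosts <<'EOF'
    server1.example.com
    server2.example.com
    EOF
    # run 'uptime' on every host in parallel as user vivek, printing output inline
    pssh -h ~/pssh-hosts -l vivek -i uptime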

Book Review: Tarsnap Mastery Online Backup For the Truly Paranoid

Posted in Amazon Web Services, Cloud Computing, Reviews; last updated January 4, 2016.

It’s always a good idea to keep backups of all of your data in multiple places. Every Linux or Unix sysadmin must master the art of backups if they want to keep their data forever. Most sysadmins recommend and follow the 3-2-1 rule:

  1. At least three copies of data.
  2. In two different formats.
  3. With one of those copies off-site.

Tarsnap is one such off-site backup service. It’s a secure online backup system for UNIX-like systems that encrypts your data and stores it in Amazon S3. To use Tarsnap properly and feel secure about your backups, you need the “Tarsnap Mastery” book by Michael W. Lucas. It is no secret that I’m a big fan of his book series. Let’s see what the book is all about.
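To give a taste of the tool the book covers, here is a minimal sketch of a Tarsnap session (assumes a registered machine and a generated key; the paths and archive names are placeholders):

    # create an encrypted archive of /home/vivek, named after today's date
    tarsnap -c -f "mybackup-$(date +%Y%m%d)" /home/vivek
    # list the archives stored with the service
    tarsnap --list-archives
    # restore an archive into the current directory
    tarsnap -x -f mybackup-20160104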

7 Awesome Open Source Cloud Storage Software For Your Privacy and Security

Posted in Cloud Computing, Datacenter, Hardware, Open Source, Storage; last updated May 7, 2017.

Cloud storage is a data storage model in which digital data is stored in logical pools across multiple servers. You can use a hosting company such as Amazon, Google, Rackspace, Dropbox and others to keep your data available and accessible 24×7, and you can access that data via an API, desktop/mobile apps, or web-based systems.

In this post, I’m going to list amazingly awesome open source cloud storage engines that you can use to sync and access your data privately, for security and privacy reasons.

How To Use Vagrant To Create a Small Virtual Test Lab on Linux / OS X / MS-Windows

Posted in Cloud Computing, Virtualization; last updated June 3, 2017.

Vagrant is a multi-platform command line tool for creating lightweight, reproducible and portable virtual environments. It acts as a glue layer between different virtualization solutions (software, hardware, PaaS and IaaS) and different configuration management utilities (Puppet, Chef, etc.). Vagrant was started back in 2010 by Mitchell Hashimoto as a side project and later became one of the first products of HashiCorp, the company Mitchell founded.

While officially described as a tool for setting up development environments, Vagrant can be used for a lot of other purposes by non-developers as well:

  • Creating demo labs
  • Testing configuration management tools
  • Speeding up work with tools that are not multi-platform, such as Docker

In this tutorial I’ll show how we can use Vagrant to create a small virtual test lab that we can pass along to our colleagues.
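As a minimal sketch (assumes Vagrant and VirtualBox are installed; the ubuntu/trusty64 box is just an example):

    # write a Vagrantfile for a stock Ubuntu box and boot the VM
    vagrant init ubuntu/trusty64
    vagrant up
    # log in to the running VM
    vagrant ssh
    # tear the lab down when it is no longer needed
    vagrant destroy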

Amazon AWS Route 53 GEO DNS Configurations

Posted in Amazon Web Services; last updated April 28, 2013.

You can send visitors to different servers based on the country of their IP address using Amazon Route 53, a cloud-based DNS service. For example, if you have a server in Amsterdam, a server in America, and a server in Singapore, you can route visitors in Europe to the Amsterdam server, visitors in Asia to the Singapore server, and everyone else to the American server. This results in various benefits, such as:

  1. Better performance as you are sending web site visitors to their nearest web server.
  2. Reduced load on origin.
  3. Geomarketing/online advertising.
  4. Restricting content to those geolocated in specific countries (I am not a big fan of DRM).
  5. In some cases, potentially lower costs, and more.

In this post, I will explain how to configure and test GeoDNS using the AWS Route 53 service.
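As a rough sketch using today’s AWS CLI geolocation routing support (the hosted zone ID, record name, and IP address below are placeholders):

    # route European visitors to the Amsterdam server (placeholder values)
    aws route53 change-resource-record-sets --hosted-zone-id Z123EXAMPLE \
      --change-batch '{"Changes":[{"Action":"CREATE","ResourceRecordSet":{
        "Name":"www.example.com","Type":"A","SetIdentifier":"europe",
        "GeoLocation":{"ContinentCode":"EU"},"TTL":60,
        "ResourceRecords":[{"Value":"203.0.113.10"}]}}]}'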

Amazon Cloudfront Dynamic Content Delivery With A WordPress Blog

Posted in Amazon Web Services, Content Delivery Network; last updated March 7, 2013.

A typical WordPress blog contains a mix of static assets such as images, JavaScript and style sheets, and dynamic content such as posts, pages and comments posted by users. You can speed up your blog by serving static content via a content delivery network such as Akamai, EdgeCast and so on. The big players in the CDN business also offer solutions that accelerate dynamic content to improve the performance and reliability of a blog; however, those traditional offerings are expensive. Amazon CloudFront recently started serving dynamic content at a lower price. In this blog post, I will explain:

  1. How to serve your entire blog using CloudFront.
  2. DNS settings.
  3. WordPress settings.
  4. Documenting the limitations of CloudFront.
  5. Documenting the performance improvements.
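As a sketch of the first step, today’s AWS CLI can create a distribution that pulls from your blog as its origin (the domain name is a placeholder); you then point your site’s DNS at the *.cloudfront.net domain the call returns:

    # create a CloudFront distribution with the blog as its origin
    aws cloudfront create-distribution --origin-domain-name blog.example.com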

Testing HTTP Status: 206 Partial Content and Range Requests

Posted in Cloud Computing, Command Line Hacks, Howto, Networking, Web Developer; last updated November 17, 2012.

The HTTP 2xx class of status codes indicates that the action requested by the client was received and processed successfully. HTTP/1.1 200 OK is the standard response for a successful HTTP request; when you type www.cyberciti.biz in the browser you get this status code. The HTTP/1.1 206 status code allows the client to grab only part of a resource by sending a Range header. This is useful for:

  1. Understanding HTTP headers and the protocol.
  2. Troubleshooting network problems.
  3. Troubleshooting large download problems.
  4. Troubleshooting CDN and origin HTTP server problems.
  5. Testing the resumption of interrupted downloads with tools like lftp, wget or telnet.
  6. Testing splitting a download into multiple simultaneous streams, i.e. downloading a large file in parts.
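A quick sketch with curl (the URL is a placeholder; any server that supports range requests will do):

    # request only the first 1024 bytes; a server that supports ranges
    # replies with "206 Partial Content" and a Content-Range header
    curl -s -D - -o /dev/null -H 'Range: bytes=0-1023' https://www.example.com/file.iso
    # curl's -r option is shorthand for the same Range header
    curl -s -D - -o /dev/null -r 0-1023 https://www.example.com/file.iso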

Amazon Glacier: Cloud Storage For Archives And Backups Launched

Posted in Amazon Web Services; last updated August 21, 2012.

Amazon Web Services (AWS) launched a new service called Amazon Glacier. You can use this service for archiving mission-critical data and backups in a reliable way, whether in an enterprise IT setting or for personal use. The service costs as little as $0.01 (one US penny, one one-hundredth of a dollar) per gigabyte per month. You can store a lot of data in geographically distinct facilities, with AWS verifying hardware and data integrity, irrespective of the length of your retention periods. The first thing that comes to mind is that Glacier would be a good place for an off-site backup of family photos and videos from my local 12TB NAS.
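As a rough sketch with today’s AWS CLI (the vault and file names are placeholders; the '-' tells the CLI to use the calling account’s ID):

    # create a vault and upload one archive to it
    aws glacier create-vault --account-id - --vault-name family-photos
    aws glacier upload-archive --account-id - --vault-name family-photos \
        --body photos-2012.tar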

Amazon Announces SSD Storage Based High I/O EC2 Server / Instance

Posted in Amazon Web Services; last updated July 19, 2012.

Excellent news, and this may come in handy. In our data center we have a few servers dedicated to just two applications. These applications run for only 2 or 3 days a month, and the rest of the time the servers in the rack sit idle. That is a waste of servers, time, energy and resources. This is a good use case for on-demand high I/O server(s): they deliver the low latency I need, and SSD-backed instances are an exceptionally good host for NoSQL databases such as MongoDB.
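As a sketch, you could launch the SSD-backed high I/O instance type announced at the time (hi1.4xlarge) with today’s AWS CLI; the AMI ID and key name are placeholders:

    # launch one on-demand high I/O instance (placeholder AMI ID and key name)
    aws ec2 run-instances --image-id ami-12345678 \
        --instance-type hi1.4xlarge --count 1 --key-name mykey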