
GoReplay - Testing Your Site with Actual Traffic

Goal:
In this article we will learn how to capture real-time traffic from production and reuse it in your testing/development environment.

Prerequisite:
A running web server. If you are just playing around, you can use GoReplay's test file server (or any simple HTTP server) as a target.
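If you don't have a web server handy, one quick stand-in target (my suggestion, not part of GoReplay itself; the port and directory are arbitrary) is Python's built-in HTTP server:
#start a throwaway HTTP server on port 8000 to use as a capture/replay target
cd /tmp
python3 -m http.server 8000
#in another terminal, verify it responds
curl -s http://localhost:8000/ | head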

Let's Begin
Load testing a site serving millions of users was never this easy before I came across GoReplay. I am not going to explain here how great GoReplay is; you will see for yourself after following the steps below to capture and replay your request logs. FYI, GoReplay captures traffic at the packet level, much like tcpdump does.

Installation:
Download the tar file from the GoReplay GitHub releases page and extract it.
#create a directory
mkdir ~/goreplay
#go to directory you created
cd ~/goreplay
#download tar file from goreplay git repo
wget https://github.com/buger/goreplay/releases/download/v0.16.1/gor_0.16.1_x64.tar.gz
#extract it
tar -xf gor_0.16.1_x64.tar.gz

After extracting, check that the goreplay binary is present in the directory.
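A quick sanity check (a minimal sketch; running the binary with no arguments should only print its usage, which confirms it runs):
#list the extracted files; you should see an executable named goreplay
ls -l ~/goreplay
#running it with no arguments should print usage/help
./goreplay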

Capture Request:
Run this on the production server to capture the requests coming in on port 80.
sudo ./goreplay --input-raw :80 --output-file filename --exit-after 5m 

This command captures the requests coming in on port 80, stores them in filename, and stops capturing after 5 minutes.
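Since the capture runs on the production box and the replay usually runs elsewhere, copy the output file over to your testing machine first. A minimal sketch using scp; the user, hostname, and paths below are placeholders, not part of the original setup:
#copy the captured requests from the production server to the test machine
scp ~/goreplay/filename user@test-machine:~/goreplay/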

Replay Request:
Run this on the testing machine to replay the same traffic against your test environment.
sudo ./goreplay --input-file filename --output-http "http://domain.com/"

This command replays the requests already stored in filename against the given HTTP domain.
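If you do not need the intermediate file, the two steps can be combined so that live traffic is mirrored straight from production to the test site. This sketch simply chains the --input-raw and --output-http flags already shown above; the staging URL is a placeholder:
#capture port 80 on production and forward each request to the test site in real time
sudo ./goreplay --input-raw :80 --output-http "http://staging.domain.com/"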

Conclusion:
See, I told you: testing your development site with actual production requests has never been this easy.

For further reading, please check the following links:

GoReplay Documentation
https://github.com/buger/goreplay/wiki/Getting-Started

A blog post by Leonid Bugaev (the GoReplay author)
https://leonsbox.com/goreplay-v0-16-and-4th-anniversary-5408b1fd72e0
