
Posts

My First Azure Function app using CLI

Goal: Here we will learn how to create and deploy our first Azure Function app to the Azure cloud, entirely from the command line.

Steps: First we will install all the required command-line utilities. Then we will create a function locally to test and debug it, and finally we will deploy it to an Azure Function app in the cloud.

Install npm. Most Azure utilities are written in Node, so we need npm to download their dependencies.

sudo apt install npm

Install python3 pip and virtualenv along with a few other Python development dependencies, since we will develop our first app in Python.

sudo apt install -y python3-pip python3-venv build-essential libssl-dev libffi-dev python3-dev

Install the Azure Functions command-line utility to initialize, debug, and publish functions, and the Azure CLI to interact with the Azure cloud.

sudo npm install -g azure-functions-core-tools
func --version
cu
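The excerpt cuts off before the function is actually created and published. As a rough sketch of the remaining flow, assuming a Python HTTP-trigger function; the project, app, resource group, storage account names, and the region are all placeholders I chose, not values from the post:

# create and test the function locally
func init MyFunctionProj --python
cd MyFunctionProj
func new --name HttpExample --template "HTTP trigger"
func start   # serves the function at http://localhost:7071 for local testing

# create the cloud resources and deploy (region and SKU choices are assumptions)
az login
az group create --name myResourceGroup --location westeurope
az storage account create --name mystorageacct123 --resource-group myResourceGroup --sku Standard_LRS
az functionapp create --name MyUniqueFuncApp --resource-group myResourceGroup \
  --storage-account mystorageacct123 --consumption-plan-location westeurope \
  --runtime python --os-type Linux
func azure functionapp publish MyUniqueFuncApp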

AWS EC2 AMI Back up Script

Goal: If you are looking for a ready-made, working script to back up your EC2 servers, this article gives you working shell script code.

Steps:

Install the AWS CLI. (I am adding commands for Linux/macOS systems; for other environments please follow the link shared in each step.)

$ python -m pip install awscli

Detailed steps for various environments can be found here. To check the version after installation, run

$ aws --version

Set up credentials on your machine using the command below and follow the prompts.

$ aws configure

More details can be found here.

Once everything has been set up, you can copy the code from the gist below (a sketch of the same idea follows this section). It takes a few inputs:
- Elastic IP of the server
- How long you want to keep your old AMIs
- Server name, i.e. the initial name you want to give your backup AMI

Hope this article helped you. Let me know if you face any difficulty using it and I will try to help you out ASAP.
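The gist itself is not reproduced in this excerpt, so here is a minimal sketch of what such a backup script typically looks like. The variable names and the JMESPath date comparison are my choices, not necessarily the gist's, and it assumes GNU date:

#!/bin/bash
# inputs matching the three listed above (placeholders)
ELASTIC_IP="$1"       # Elastic IP of the server to back up
RETENTION_DAYS="$2"   # how long to keep old AMIs, in days
SERVER_NAME="$3"      # name prefix for the backup AMIs

# resolve the instance behind the Elastic IP
INSTANCE_ID=$(aws ec2 describe-addresses --public-ips "$ELASTIC_IP" \
  --query 'Addresses[0].InstanceId' --output text)

# create today's AMI without rebooting the instance
aws ec2 create-image --instance-id "$INSTANCE_ID" \
  --name "${SERVER_NAME}-$(date +%Y-%m-%d)" --no-reboot

# deregister AMIs with our name prefix that are older than the retention window
# (note: deregistering an AMI does not delete its EBS snapshots)
CUTOFF=$(date -d "-${RETENTION_DAYS} days" +%Y-%m-%d)
for AMI_ID in $(aws ec2 describe-images --owners self \
    --filters "Name=name,Values=${SERVER_NAME}-*" \
    --query "Images[?CreationDate<'${CUTOFF}'].ImageId" --output text); do
  aws ec2 deregister-image --image-id "$AMI_ID"
done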

Prometheus Installation

Goal: This blog will help you with Prometheus installation.

Installation steps:

Create a user without a home directory.

sudo useradd --no-create-home --shell /bin/false prometheus

Create directories to hold the Prometheus config and library files, and give ownership to the user you created.

sudo mkdir /etc/prometheus
sudo mkdir /var/lib/prometheus
sudo chown prometheus:prometheus /etc/prometheus
sudo chown prometheus:prometheus /var/lib/prometheus

Download and unpack Prometheus from GitHub.

curl -LO https://github.com/prometheus/prometheus/releases/download/v2.3.2/prometheus-2.3.2.linux-amd64.tar.gz
tar -xvf prometheus-2.3.2.linux-amd64.tar.gz
mv prometheus-2.3.2.linux-amd64 prometheus-files

Copy the Prometheus binaries to the bin directory and set permissions.

sudo cp prometheus-files/prometheus /usr/local/bin/
sudo cp prometheus-files/promtool /usr/local/bin/
sudo chown prometheus:prometheus /usr/l
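For convenience, here are the steps above condensed into a single run-through. The final chown is truncated in the excerpt; I am assuming it follows the usual pattern of handing the copied binaries to the prometheus user:

#!/bin/bash
# condensed version of the steps above; version pinned to v2.3.2 as in the post
sudo useradd --no-create-home --shell /bin/false prometheus
sudo mkdir /etc/prometheus /var/lib/prometheus
sudo chown prometheus:prometheus /etc/prometheus /var/lib/prometheus

curl -LO https://github.com/prometheus/prometheus/releases/download/v2.3.2/prometheus-2.3.2.linux-amd64.tar.gz
tar -xvf prometheus-2.3.2.linux-amd64.tar.gz
mv prometheus-2.3.2.linux-amd64 prometheus-files

sudo cp prometheus-files/prometheus /usr/local/bin/
sudo cp prometheus-files/promtool /usr/local/bin/
# assumption: the truncated chown lines give the binaries to the prometheus user
sudo chown prometheus:prometheus /usr/local/bin/prometheus /usr/local/bin/promtool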

Useful MongoDB commands

To export JSON from a MongoDB collection:

mongoexport -h replicaSet/primaryHost,secondaryHost -u username -p pwd --db dbName -c collectionName --quiet > collectionExport.json

To export BSON from a MongoDB collection:

mongodump -h replicaSet/primaryHost,secondaryHost -u username -p pwd --db dbName -c collectionName -o - > collectionDump.bson

To export selected fields as CSV from a MongoDB collection:

mongoexport --host primaryHost --db dbName --collection collectionName --fields 'field1,field2,field3' --out collection.csv --csv -u userName -p

To run JavaScript against MongoDB:

mongo replicaSet/primaryHost,secondaryHost/dbName -u userName mongoScript.js -p

To restore BSON into MongoDB:

mongorestore --host primaryHost --db dbName --collection collectionName -u userName -p --drop collectionName.bson
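If you don't want a separate script file, the same JavaScript-on-MongoDB idea works inline with --eval. A small sketch; the host, database, and collection names are placeholders, and mongo will prompt for the password since -p is given no value:

# print collection statistics without writing a script file
mongo replicaSet/primaryHost,secondaryHost/dbName -u userName -p --quiet \
  --eval 'printjson(db.collectionName.stats())'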

Curator

Goal: In this tutorial we will cover deletion of old logs in the ELK stack. We will achieve this by deleting the old indices that Logstash creates while dumping logs into Elasticsearch.

Prerequisites: Old logs to delete... 😜😜

Let's begin the exercise:

Install Curator. Curator is a package in the Elasticsearch repository used to delete old indices.

Create a file:

sudo vi /etc/yum.repos.d/curator.repo

Paste the following lines (see the sketch after this section), then save and exit the file. Run yum install:

sudo yum install elasticsearch-curator

Configure Curator. Create a directory:

mkdir ~/.curator/

Open a file:

sudo vi ~/.curator/curator.yml

Paste the following code, then save and exit the file.

Deletion pattern. Create a file to define the delete pattern in Elasticsearch:

sudo vi ~/.curator/delete_indices.yml

Paste the following lines into the file. Create a log file for Curator at the location you defined in the configuration, and assign permission to write into the file.

sudo touch /var/log/curator #to assign permission to write l
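The embedded "paste following lines" snippets did not survive in this excerpt. As an assumption based on a standard Curator 5.x setup, the three files typically look like this; the repo URL, Elasticsearch endpoint, index prefix, and 30-day retention are all values to adjust for your environment:

# /etc/yum.repos.d/curator.repo -- Elastic's Curator 5 package repo (assumed)
sudo tee /etc/yum.repos.d/curator.repo > /dev/null <<'EOF'
[curator-5]
name=CentOS/RHEL 7 repository for Elasticsearch Curator 5.x packages
baseurl=https://packages.elastic.co/curator/5/centos/7
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
EOF

# ~/.curator/curator.yml -- client config pointing at the local cluster
tee ~/.curator/curator.yml > /dev/null <<'EOF'
client:
  hosts:
    - 127.0.0.1
  port: 9200
logging:
  loglevel: INFO
  logfile: /var/log/curator
  logformat: default
EOF

# ~/.curator/delete_indices.yml -- delete logstash-* indices older than 30 days
tee ~/.curator/delete_indices.yml > /dev/null <<'EOF'
actions:
  1:
    action: delete_indices
    description: Delete logstash indices older than 30 days
    options:
      ignore_empty_list: True
    filters:
    - filtertype: pattern
      kind: prefix
      value: logstash-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 30
EOF

You would then run it (e.g. from cron) with: curator --config ~/.curator/curator.yml ~/.curator/delete_indices.yml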

Elastalert

Goal: Trigger alerts on an Elasticsearch stream.

Install Elastalert:

sudo yum install gcc
sudo pip install elastalert
sudo yum install git

Just to get the basic Elastalert rules for reference, clone the following git repository:

git clone https://github.com/Yelp/elastalert.git example_elastalert

Create your own rules:

sudo mkdir -p /etc/elastalert/rules_folder/
sudo cp ~/example_elastalert/example_rules/example_frequency.yaml /etc/elastalert/rules_folder/frequency.yaml
sudo cp ~/example_elastalert/config.yaml.example /etc/elastalert/config.yaml

Configure Elastalert:

sudo vi /etc/elastalert/config.yaml

Search for 'rules_folder' and 'es_host' and change the values in the file:

rules_folder: "/etc/elastalert/rules_folder"
es_host: localhost

Now open the rule file and change its conf:

sudo vi /etc/elastalert/rules_folder/frequency.yaml

filter:
- term:
    _type: "apache-error"
- query:
    query_string:
      query: "host:HostName"
alert:
- "email"
include: [
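The excerpt ends mid-rule, but once the rule file is filled in, Elastalert ships entry points to sanity-check a rule and to run the daemon. A short sketch, assuming the paths match the layout above:

# validate the rule against real data without sending alerts
elastalert-test-rule --config /etc/elastalert/config.yaml /etc/elastalert/rules_folder/frequency.yaml

# create the elastalert_* metadata indices in Elasticsearch (one-time step)
elastalert-create-index

# run the alerter in the foreground with verbose logging
python -m elastalert.elastalert --verbose --config /etc/elastalert/config.yaml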

Install Central Logging on Amazon Linux

Goal: In this tutorial we will cover the setup of a central logging system on Amazon Linux (CentOS-based) within the same AWS VPC. We will set up one central log server to receive logs using rsyslog, and then set up one client to forward Apache and syslog logs to the central server. We have already covered forwarding logs from the central log server to an ELK stack for analysis.

Logging stack components:
- Central log server
- Multiple logging client servers / any Apache web server generating logs
- Rsyslog: we set this up with rsyslog v8-stable. You can use any rsyslog version after rsyslog-6, because we encountered rsyslog dropping messages in earlier versions.

Prerequisites: Rsyslog is quite lightweight and doesn't require a high-configuration machine; an AWS t2.micro should be enough. We are running a t2.micro in production as the central log server, receiving around 1000 log entries/second within the same VPC while using less than 2 percent CPU.

Now let's start. We will break this tutorial into two pa
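The excerpt cuts off before the two parts, but the core of such a setup is only a few rsyslog directives. A minimal sketch, assuming the v8 legacy syntax and TCP on port 514; the drop-in file names and the central server hostname are placeholders:

# on the central log server: accept TCP syslog on port 514
sudo tee /etc/rsyslog.d/00-central.conf > /dev/null <<'EOF'
$ModLoad imtcp
$InputTCPServerRun 514
EOF

# on each client: forward all facilities to the central server (@@ = TCP, @ = UDP)
sudo tee /etc/rsyslog.d/10-forward.conf > /dev/null <<'EOF'
*.* @@central-log-server:514
EOF

# restart rsyslog on both sides to apply the changes
sudo service rsyslog restart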