
Log aggregation with Logstash, Elasticsearch, Graylog2 and more: Part One

Setup, problem and solution design.
The purpose of log aggregation is to provide a single point of access to server data (in our case, nginx web servers).
We have a lot of web servers writing huge amounts of logs and no real way to understand what is going on there. The initial solution was to have each system write a local log file, with a Munin agent running a custom Perl parser that transferred the data to a Munin server, where it was displayed as an RRDtool graph. It worked; however, the servers generated so much log data that parsing it close to real time became impossible, forcing us to drop a significant amount of data.

After some internet research, and due to budget constraints, we decided to go with open source tools only. Those applications, however, still had to handle high volume and high load, scale out, and support big data.
We decided to set up a dedicated loghost and ship all the data to it, parsing it on the spot into the needed results. Another thing our proposed solution took into consid…
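As a rough sketch of what such a ship-and-parse pipeline can look like, a minimal Logstash configuration on a web server might resemble the one below. The log path, the loghost name and the pattern are illustrative only, and option syntax differs between Logstash versions:

input {
  file {
    # tail the nginx access log on this web server
    path => "/var/log/nginx/access.log"
  }
}
filter {
  grok {
    # nginx's default combined log format matches the Apache combined pattern
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    # ship parsed events to the dedicated loghost
    hosts => ["loghost:9200"]
  }
}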

Setting up an Amazon AWS EC2 FTP server with Linux and vsftpd:

Install vsftpd (example for Ubuntu / Debian):

apt-get -y install vsftpd
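As a quick sanity check, you can confirm the daemon is running and listening on the FTP control port (the exact service command may vary with your init system):

service vsftpd status
netstat -tlnp | grep :21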

Edit the configuration file (in our example, with local authentication and no anonymous/guest access):

vi /etc/vsftpd.conf

# allow FTP write commands (upload, delete, rename)
write_enable=YES
# disable anonymous logins
anonymous_enable=NO
# umask for files created by FTP users (new files end up 644)
local_umask=022
# allow local Unix users to log in
local_enable=YES


# to add passive FTP (needed behind NAT, as on EC2):
pasv_enable=YES
pasv_min_port=12000
pasv_max_port=12100
port_enable=YES
# public IP of your instance, without quotes (for a DNS name, also set pasv_addr_resolve=YES)
pasv_address=your.external.instance.ip
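After saving the file, restart vsftpd so the new settings take effect:

service vsftpd restart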


Finally, open inbound TCP ports 20-21 (FTP) and the passive range 12000-12100 in your instance's security group.
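As a sketch, this can also be done from the command line rather than the console; the security group ID below is a hypothetical placeholder:

aws ec2 authorize-security-group-ingress --group-id sg-12345678 --protocol tcp --port 20-21 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id sg-12345678 --protocol tcp --port 12000-12100 --cidr 0.0.0.0/0

You can then test from a remote machine with a passive-mode FTP client, e.g. ftp your.external.instance.ip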

Provided by: ForthScale systems, scalable infrastructure experts