Moving out

Hello there, and thank you for reading my blog! I will no longer be updating this site since I’ve relocated the blog. My new blog is self-hosted and is located at www.villepaasonen.com. See you there!

-Ville


Assignment 7: Server performance optimization, Varnish and Firefox add-ons

This week’s assignment was to measure the speed difference between a locally hosted WordPress page and a similar static page, then install Varnish and measure the performance of a dynamic web page before and after the installation, to see whether performance could be improved further.

I had LAMP preinstalled on my system, so I’m going to skip the LAMP installation; if you’re interested in installing LAMP, check out my previous posts on this blog. Just to be sure Apache was working correctly, I typed localhost into my web browser’s address bar. Everything worked, so I moved on with the assignment.

First, I ran the following command:

ab -r -c 500 -n 1000 http://localhost/wordpress/

This command uses ‘ab’, Apache’s own benchmarking tool for measuring performance. -c sets the number of concurrent clients and -n the total number of requests sent to the server. ‘ab’ is one of the apache2 utilities, installed with the command sudo apt-get install apache2-utils. The command above simulates 500 concurrent clients sending a total of 1000 requests.
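As a sanity check, ab’s two headline figures can be recomputed from the totals it reports (figures taken from the run shown next; ab itself times in microseconds, hence the tiny rounding difference):

```shell
# Recompute ab's summary figures from the run's totals:
# requests per second = complete requests / total test time
awk 'BEGIN { printf "%.2f\n", 1000 / 29.028 }'               # 34.45, matching the report
# mean time per request (ms) = concurrency * time taken * 1000 / requests
awk 'BEGIN { printf "%.0f\n", 500 * 29.028 * 1000 / 1000 }'  # 14514 (ab reports 14513.998)
```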

This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /wordpress/
Document Length: 9419 bytes

Concurrency Level: 500
Time taken for tests: 29.028 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 9675000 bytes
HTML transferred: 9419000 bytes
Requests per second: 34.45 [#/sec] (mean)
Time per request: 14513.998 [ms] (mean)
Time per request: 29.028 [ms] (mean, across all concurrent requests)
Transfer rate: 325.49 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 104 405.4 13 3004
Processing: 885 6844 7472.1 3348 29009
Waiting: 761 6735 7502.7 3221 29005
Total: 901 6948 7442.8 3430 29022

Percentage of the requests served within a certain time (ms)
50% 3430
66% 4140
75% 6633
80% 15705
90% 16043
95% 28886
98% 29000
99% 29009
100% 29022 (longest request)

There were 34 requests per second on average.

Next, we shall run the same number of requests against a single WordPress article:

ab -r -c 500 -n 1000 http://localhost/wordpress/2014/04/just-testin/

The article was created only for testing purposes in my previous assignment.

This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /wordpress/2014/04/just-testin/
Document Length: 11197 bytes

Concurrency Level: 500
Time taken for tests: 28.989 seconds
Complete requests: 1000
Failed requests: 285
(Connect: 0, Receive: 95, Length: 95, Exceptions: 95)
Write errors: 0
Total transferred: 10415645 bytes
HTML transferred: 10133285 bytes
Requests per second: 34.50 [#/sec] (mean)
Time per request: 14494.691 [ms] (mean)
Time per request: 28.989 [ms] (mean, across all concurrent requests)
Transfer rate: 350.87 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 234 437.6 11 3003
Processing: 83 6017 7581.1 2471 28974
Waiting: 0 5563 7742.3 1875 28958
Total: 93 6251 7511.3 2797 28984

Percentage of the requests served within a certain time (ms)
50% 2797
66% 4524
75% 5274
80% 9558
90% 15764
95% 28828
98% 28926
99% 28962
100% 28984 (longest request)

The speed is at the same level as before, 34-35 requests per second, although this time 285 of the 1000 requests failed, a sign that the server was struggling under the load. Next, we shall run the same test against a local static website:

ab -r -c 500 -n 1000 http://localhost/~ville/index.html

This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /~ville/index.html
Document Length: 104 bytes

Concurrency Level: 500
Time taken for tests: 0.464 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 373000 bytes
HTML transferred: 104000 bytes
Requests per second: 2154.29 [#/sec] (mean)
Time per request: 232.096 [ms] (mean)
Time per request: 0.464 [ms] (mean, across all concurrent requests)
Transfer rate: 784.72 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 4 4.8 0 13
Processing: 2 87 141.4 12 449
Waiting: 2 87 141.4 12 449
Total: 9 91 145.0 12 458

Percentage of the requests served within a certain time (ms)
50% 12
66% 14
75% 232
80% 236
90% 240
95% 455
98% 457
99% 457
100% 458 (longest request)

This run was much faster than the previous measurements, completing in under half a second at over 2150 requests per second. The next thing to do is to install Varnish, an HTTP accelerator designed for content-heavy dynamic web sites. The command to install Varnish is sudo apt-get install varnish -y.

Varnish needs a few adjustments before it works correctly. First, modify the Apache ports configuration file with sudoedit /etc/apache2/ports.conf. The first uncommented line says Listen 80; change it to Listen 8080 so that Apache no longer occupies port 80.

Next, edit the Varnish defaults file with sudoedit /etc/default/varnish. Find the uncommented line that says DAEMON_OPTS="-a :6081 \ and change the port so it reads DAEMON_OPTS="-a :80 \, making Varnish listen on port 80. Save the changes and restart Apache and Varnish:

sudo service apache2 restart
sudo service varnish restart
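For reference, the two edits amount to the following; the remaining DAEMON_OPTS lines shown are the stock Debian/Ubuntu defaults and can be left untouched. Varnish’s shipped default backend in /etc/varnish/default.vcl already points at 127.0.0.1:8080, which is why moving Apache to port 8080 is enough:

```
# /etc/apache2/ports.conf -- Apache moves off port 80:
Listen 8080

# /etc/default/varnish -- Varnish takes over port 80:
DAEMON_OPTS="-a :80 \
             -T localhost:6082 \
             -f /etc/varnish/default.vcl \
             -S /etc/varnish/secret \
             -s malloc,256m"
```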

Let’s try to stress localhost/wordpress again:

ab -r -c 500 -n 1000 http://localhost/wordpress
This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /wordpress
Document Length: 309 bytes

Concurrency Level: 500
Time taken for tests: 0.097 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Non-2xx responses: 1000
Total transferred: 614990 bytes
HTML transferred: 309000 bytes
Requests per second: 10287.85 [#/sec] (mean)
Time per request: 48.601 [ms] (mean)
Time per request: 0.097 [ms] (mean, across all concurrent requests)
Transfer rate: 6178.64 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 3 15 4.6 16 21
Processing: 4 24 10.7 23 51
Waiting: 3 21 8.6 20 45
Total: 20 39 9.9 38 68

Percentage of the requests served within a certain time (ms)
50% 38
66% 45
75% 49
80% 50
90% 51
95% 52
98% 54
99% 57
100% 68 (longest request)

The test completed extremely fast, at over 10000 requests per second. Note, however, that this URL lacks the trailing slash used earlier; the 1000 non-2xx responses and the 309-byte document length show that ab was apparently measuring the redirect to /wordpress/ rather than the full page. Let’s try stressing the test article in WordPress with ab -r -c 500 -n 1000 http://localhost/wordpress/2014/04/just-testin/:

This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /wordpress/2014/04/just-testin/
Document Length: 11197 bytes

Concurrency Level: 500
Time taken for tests: 0.541 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 11566990 bytes
HTML transferred: 11197000 bytes
Requests per second: 1847.08 [#/sec] (mean)
Time per request: 270.697 [ms] (mean)
Time per request: 0.541 [ms] (mean, across all concurrent requests)
Transfer rate: 20864.41 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 1 10 4.9 13 18
Processing: 8 252 219.3 446 520
Waiting: 5 245 220.7 445 515
Total: 21 262 222.9 458 536

Percentage of the requests served within a certain time (ms)
50% 458
66% 468
75% 486
80% 491
90% 503
95% 516
98% 519
99% 522
100% 536 (longest request)
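Comparing this run’s throughput with the pre-Varnish run against the same article page (34.50 requests per second) gives the speedup directly:

```shell
# Throughput before Varnish: 34.50 req/s; after: 1847.08 req/s
awk 'BEGIN { printf "%.1fx\n", 1847.08 / 34.50 }'   # 53.5x
```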

About 1850 requests per second, which indicates a major improvement in speed compared with the previous result of 34.50 requests per second. Finally, let’s test the local static test site with ab -r -c 500 -n 1000 http://localhost/~ville/index.html.

This is ApacheBench, Version 2.3
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software: Apache/2.4.6
Server Hostname: localhost
Server Port: 80

Document Path: /~ville/index.html
Document Length: 104 bytes

Concurrency Level: 500
Time taken for tests: 1.527 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 392990 bytes
HTML transferred: 104000 bytes
Requests per second: 655.00 [#/sec] (mean)
Time per request: 763.360 [ms] (mean)
Time per request: 1.527 [ms] (mean, across all concurrent requests)
Transfer rate: 251.38 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 1 11 4.2 12 18
Processing: 3 196 433.8 24 1510
Waiting: 2 193 435.2 16 1510
Total: 13 208 433.4 35 1525

Percentage of the requests served within a certain time (ms)
50% 35
66% 41
75% 43
80% 50
90% 1166
95% 1521
98% 1523
99% 1524
100% 1525 (longest request)

Averaging 655 requests per second, this run was still fast, but interestingly slower than the 2154 requests per second the same static page managed before Varnish was installed. The most interesting observation overall is that the simple, local, static website benefited least from Varnish: caching pays off most where generating the page is expensive, as with the dynamic WordPress pages.

Next, we shall try Firebug, a popular and useful set of web development tools for Firefox. To install it, open Firefox, go to Tools –> Add-ons, search for ‘Firebug’ and install it. Click the new Firebug button next to the URL bar and select Firebug UI location –> Detached. In the UI, go to the Net panel and enable it. Then open localhost/wordpress in Firefox and check the Net panel:

[Image: Firebug Net panel for localhost/wordpress]

As the image shows, it took 53 milliseconds to load the page and a few more milliseconds to load the CSS, scripts and other parts of the page. The total time for all seven requests was 127 milliseconds (onload: 278 milliseconds). Unfortunately there aren’t many hints for improvement here; let’s see if YSlow offers some.

YSlow is a tool that analyzes web pages and explains why they’re slow, based on Yahoo!’s rules for high-performance web sites. It is installed the same way as Firebug.

YSlow is accessed via the Firebug UI, under the YSlow tab. Run the test while the selected website, in this case localhost/wordpress, is open. The site gets an overall grade of B (on a scale of A-F) and an overall performance score of 90/100, with every section graded A except the following:

Grade F on Use a Content Delivery Network (CDN)

There are 6 static components that are not on CDN.

Grade F on Add Expires headers

There are 7 static components without a far-future expiration date.

Luckily there are no problems besides the static components.

That’s about it! This was the final assignment of the course. The course was really challenging and educational but also very fun and interesting. Hopefully you readers also gained some new information and/or learned new skills. See you!


Assignment 6: Installing customized WordPress and Drupal on Xubuntu 13.10 Saucy Salamander

This week’s assignment was to install WordPress locally on LAMP and apply a few customizations to the software.

I was previously doing these assignments on my Xubuntu 12.04 laptop. Now that an era has ended and Windows XP is no longer supported or updated, I decided to revive my old desktop beast with Xubuntu 13.10. The computer was built in late 2007 and it runs like a champion with Xubuntu 13.10. I can highly recommend doing the same to your old XP computers.

I had LAMP preinstalled on my system, so I’m going to skip the LAMP installation; if you’re interested in installing LAMP, check out my previous posts on this blog. Just to be sure Apache was working correctly, I typed localhost into my web browser’s address bar. Everything worked, so I moved on with the assignment.

I had also previously enabled virtual hosting locally, so I built the document path on top of the virtual host.

Remember, the following actions won’t succeed if LAMP isn’t already installed on your system!

The first thing to do is to download the latest version of WordPress by typing wget http://wordpress.org/latest.tar.gz into the terminal. Extract the downloaded tarball with tar -xzvf latest.tar.gz.

Now that the files are extracted, a database must be created for WordPress. The following commands in the Terminal Emulator create the database and a user, and grant the user privileges on the database.

ville@Ville-Xubuntu:~$ mysql -u root -p
 Enter password:
 Welcome to the MySQL monitor.  Commands end with ; or \g.
 Your MySQL connection id is 45
 Server version: 5.5.35-0ubuntu0.13.10.2 (Ubuntu)
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
 affiliates. Other names may be trademarks of their respective
 owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> CREATE DATABASE wordpress;
 Query OK, 1 row affected (0.00 sec)
mysql> CREATE USER ville@localhost;
 Query OK, 0 rows affected (0.00 sec)
mysql> SET PASSWORD FOR ville@localhost= PASSWORD("insert_password_here");
 Query OK, 0 rows affected (0.04 sec)
mysql> GRANT ALL PRIVILEGES ON wordpress.* TO ville@localhost IDENTIFIED BY "insert_password_here";
 Query OK, 0 rows affected (0.00 sec)
mysql> FLUSH PRIVILEGES;
 Query OK, 0 rows affected (0.00 sec)
mysql> exit
 Bye

Now it is time to set the WordPress configurations. First run the commands:

cp ~/wordpress/wp-config-sample.php ~/wordpress/wp-config.php
 sudo nano ~/wordpress/wp-config.php

The latter command opens the WordPress configuration file. Modify the highlighted lines in the file to resemble the following:

<?php
 /**
 * The base configurations of the WordPress.
 *
 * This file has the following configurations: MySQL settings, Table Prefix,
 * Secret Keys, WordPress Language, and ABSPATH. You can find more information
 * by visiting {@link http://codex.wordpress.org/Editing_wp-config.php Editing
 * wp-config.php} Codex page. You can get the MySQL settings from your web host.
 *
 * This file is used by the wp-config.php creation script during the
 * installation. You don't have to use the web site, you can just copy this file
 * to "wp-config.php" and fill in the values.
 *
 * @package WordPress
 */
// ** MySQL settings - You can get this info from your web host ** //
 /** The name of the database for WordPress */
 define('DB_NAME', 'wordpress');
/** MySQL database username */
 define('DB_USER', 'ville');
/** MySQL database password */
 define('DB_PASSWORD', 'YOUR_PASSWORD');
/** MySQL hostname */
 define('DB_HOST', 'localhost');
/** Database Charset to use in creating database tables. */
 define('DB_CHARSET', 'utf8');
/** The Database Collate type. Don't change this if in doubt. */
 define('DB_COLLATE', '');
/**#@+
 * Authentication Unique Keys and Salts.
 *
 * Change these to different unique phrases!
 * You can generate these using the {@link https://api.wordpress.org/secret-key/1.1/salt/ WordPress.org s$
 * You can change these at any point in time to invalidate all existing cookies. This will force all user$
 *
 * @since 2.6.0
 */
 define('AUTH_KEY',         'put your unique phrase here');
 define('SECURE_AUTH_KEY',  'put your unique phrase here');
 define('LOGGED_IN_KEY',    'put your unique phrase here');
 define('NONCE_KEY',        'put your unique phrase here');
 define('AUTH_SALT',        'put your unique phrase here');
 define('SECURE_AUTH_SALT', 'put your unique phrase here');
 define('LOGGED_IN_SALT',   'put your unique phrase here');
 define('NONCE_SALT',       'put your unique phrase here');
/**#@-*/
/**

Insert the name of the database created earlier on the DB_NAME line. The user and password must match the MySQL user created in the previous step. Make sure the password is strong: at least lowercase and uppercase letters, numbers and symbols.

Let’s copy the WordPress folder to the /var/www/ folder with the commands:

sudo cp -R wordpress/ /var/www/
 cd /var/www/

Next, set the user rights and ownership to the Apache user:

sudo chown ville:www-data /var/www/ -R
 sudo chmod g+w /var/www/ -R

Type http://localhost/wordpress into a web browser’s address bar. The opening page of the WordPress installation should appear, as below:

[Image: the WordPress installation page]

Fill in the needed information and click Install. If it completes without errors, WordPress has been successfully installed on your system.

Now that the basic installation is complete, we will explore some basic customization options.

First, we are going to enable permalinks, which make post links clearer, more logical and easier to type.

Go to Settings –> Permalinks in the WordPress admin UI. You can choose whatever link type you like, but I chose Month and name, which makes the links of new posts show the post name along with the month and year of posting. Save the settings and exit.

Now, make a test article and publish it.

The URL is the one it is supposed to be, but a 404 error appears. To fix this, create a new file called .htaccess in the same folder where the other WordPress files reside and insert the following contents:

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

If this doesn’t work, a few more settings need to be changed. First, go to the WordPress folder and set new rights for the .htaccess file:

 cd /var/www/wordpress
 sudo touch .htaccess
 sudo chmod 666 .htaccess
 sudo chown www-data:www-data .htaccess

Then, make sure the Apache rewrite module is enabled and restart Apache:

sudo a2enmod rewrite
 sudo service apache2 restart

Finally, edit a line in the apache2.conf file:

cd /etc/apache2/
 sudoedit apache2.conf

Find the lines that say:

Options Indexes FollowSymLinks
AllowOverride None
Require all granted

and change AllowOverride None to AllowOverride All. After this, restart Apache again with sudo service apache2 restart. Permalinks should work now:

[Image: permalinks working]
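For reference, after the change the relevant block in /etc/apache2/apache2.conf should look like this (assuming the stock Ubuntu layout, with only the AllowOverride line modified):

```
<Directory /var/www/>
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
</Directory>
```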
Changing the WordPress theme doesn’t require any commands via the Terminal. Go to Appearance –> Themes, choose a fitting new theme and save the changes. That should change the graphical appearance of your WordPress site.

It is always important to back up your files, so let’s do that next. Go to Tools –> Export –> All content in the admin interface, then select Download export file to save the WordPress contents locally. The export is in .xml format.

Lastly, I installed Drupal on LAMP. Drupal is a popular, open-source content management framework. First, let’s download Drupal 7.25 with:

wget http://ftp.drupal.org/files/projects/drupal-7.25.tar.gz
tar -xvzf drupal-7.25.tar.gz
sudo mkdir /var/www/drupal
sudo mv drupal-7.25/* drupal-7.25/.htaccess drupal-7.25/.gitignore /var/www/drupal

These commands download the Drupal tarball, extract it, create a new folder, and move the extracted files (including the hidden .htaccess and .gitignore) into /var/www/drupal. Let’s create a subdirectory for media files to be added later to the new site:

sudo mkdir /var/www/drupal/sites/default/files
sudo chown www-data:www-data /var/www/drupal/sites/default/files

Next up is creating a configuration file for the new default site:

sudo cp /var/www/drupal/sites/default/default.settings.php /var/www/drupal/sites/default/settings.php
sudo chown www-data:www-data /var/www/drupal/sites/default/settings.php

After this, create a new database and user for Drupal on MySQL with:

mysqladmin -u root -p create drupal 
mysql -u root -p
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON drupal.* TO 'YOUR_USERNAME'@'localhost' 
IDENTIFIED BY 'INSERT_PASSWORD_HERE'; 
FLUSH PRIVILEGES;

Restart Apache with sudo service apache2 restart and open your web browser. Type localhost/drupal/install.php into the address bar. Create an account, log in, and follow the instructions to complete the installation. Make sure you enter the previously created database and user details into the required fields. The first account created will be the administrator account, so make it as secure as possible.

I also installed WordPress on my DigitalOcean VPS running Ubuntu 12.04 using the same instructions, with the exception that the AllowOverride setting resides in another file, /etc/apache2/sites-available/default.



Assignment 5: Name-based virtual hosting (VPS) with Apache

This week’s assignment was to get a name-based virtual hosting working with Apache and try a virtual private server (VPS) from Amazon, Linode, DigitalOcean, or another similar internet hosting service.

Virtual hosting basically means running and controlling multiple domains (an unlimited number) from a single IP address. It’s really handy, since you don’t need a separate system or server for each domain you administer.

I started by purchasing a VPS from DigitalOcean, which was affordable (starting at 5 U.S. dollars, roughly 3.8 €, per month) and seemed a proper choice overall. I named the host villepaasonen.com and set the VPS to run Ubuntu 12.04 LTS. The creation process was fast and completed without errors. I received the root username and password via e-mail and was ready to log in. On my Xubuntu 12.04 laptop I updated the package lists and the system, installed Apache, and logged onto the VPS via SSH:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install apache2
ssh root@server ip address

I logged in properly after entering the username and password, so the server droplet was ready for use. I also typed the IP address into the URL bar, and the example homepage was displayed properly.

I had LAMP preinstalled on my laptop, so I’m going to skip the LAMP installation; if you’re interested in installing LAMP, check out my previous posts on this blog. Just to be sure Apache was working correctly, I typed localhost into my web browser’s address bar. Everything worked, so I moved on with the assignment.

First, I created a directory for the information of the new website and the actual website with the commands:

sudo mkdir -p /var/www/villepaasonen.com/public_html
sudo nano /var/www/villepaasonen.com/public_html/index.html

Into the HTML file I inserted the following:

<!doctype html>
<html>
<head>
<title>www.villepaasonen.com</title>
</head>
<body>
<h1>Success: You have set up a virtual host.</h1>
</body>
</html>

I needed to adjust the user rights, since I didn’t want to use the VPS only as root. The commands:

sudo chown -R $USER:$USER /var/www/villepaasonen.com/public_html
sudo chmod -R 755 /var/www

These commands set the user rights so that all newly added and connected users may read the contents of the public files. After that, I set up the Apache configuration and turned on the virtual hosts:

sudo cp /etc/apache2/sites-available/default /etc/apache2/sites-available/villepaasonen.com
sudo nano /etc/apache2/sites-available/villepaasonen.com

I typed the following into the new config file under the ServerAdmin line:

ServerName villepaasonen.com

This specifies the current domain name for the virtual host. Next, I modified the virtual host file for another URL access and inserted the following into the file:

<VirtualHost *:80>
ServerAdmin webmaster@villepaasonen.com
ServerName villepaasonen.com
ServerAlias www.villepaasonen.com
[…]

This enables access to the domain through a second URL: it previously only accepted villepaasonen.com, but now www.villepaasonen.com works as well. It also sets the admin address of the server.

After doing that, I set up the correct document root by inserting the following:

DocumentRoot /var/www/villepaasonen.com/public_html

This is really important, because without the proper document root the virtual host cannot work. Then it was time to activate the actual host with sudo a2ensite villepaasonen.com.
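Putting the pieces together, the edited directives in /etc/apache2/sites-available/villepaasonen.com end up along these lines (other directives copied from the default file are omitted here):

```
<VirtualHost *:80>
    ServerAdmin webmaster@villepaasonen.com
    ServerName villepaasonen.com
    ServerAlias www.villepaasonen.com
    DocumentRoot /var/www/villepaasonen.com/public_html
</VirtualHost>
```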

To complete the modifications and changes, Apache needed to be restarted with sudo service apache2 restart. Now virtual hosting is enabled and ready for use.

Next assignment coming up soon!

Assignment 4: Metapackages, repositories and scripts

Assignment number four was to create a metapackage of chosen software, create a repository with reprepro, and package a shell script. First of all, a program called equivs needed to be installed; equivs is a tool for creating simple Debian packages. Before installing it, I updated the package lists:

sudo apt-get update
sudo apt-get install equivs

Equivs was installed quite quickly and with no problems, so I continued with creating the metapackage. I decided to make a useful basic metapackage for IT students, like an enhanced LAMP package which installs the “essential software” for studying:

  • LibreOffice
  • Eclipse
  • Apache server and PhpMyAdmin
  • Blender
  • Dropbox
  • MySQL server and Php5


I started building the metapackage by inserting the needed data by writing in the terminal:

equivs-control villes-studypackage.cfg
nano villes-studypackage.cfg

These commands created the base to the metapackage and opened it up for customization. I made adjustments to the contents of the package and it ended looking like this:

[Image: the edited villes-studypackage.cfg]
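A sketch of what the edited control file could look like; the package name, version and program list come from the post, while the exact Depends package names, the extra template fields and the maintainer address are my assumptions:

```
Section: misc
Priority: optional
Standards-Version: 3.9.2

Package: villes-studypackage
Version: 0.1
Maintainer: Ville Paasonen <maintainer@example.com>
Depends: libreoffice, eclipse, apache2, phpmyadmin, blender, nautilus-dropbox, mysql-server, php5
Description: Essential study software for IT students
 Installs a set of commonly needed study tools on top of LAMP.
```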
I only had to adjust a few rows: Package (the name of the package, without the .cfg extension), Version (the version number of the package), Maintainer (the maintainer and creator of the package) and Depends (the names of the programs to be installed). The package was created without problems or errors, so I continued on and ran it through Lintian, a tool that checks Debian packages for errors and inconsistencies. Lintian was already installed on my Xubuntu 12.04, but it can be installed with the command:

sudo apt-get install lintian

I ran my newly created .deb package through lintian with the command:

lintian villes-studypackage_0.1_all.deb

Lintian reported no errors, so the metapackage appeared to be valid and well formed. Next, I installed the package with gdebi:

sudo gdebi villes-studypackage_0.1_all.deb

Gdebi is a basic tool for installing .deb packages. It was pre-installed on my system but if you don’t have it installed, you can install it with:

sudo apt-get install gdebi

The installation process was very fast, since all the software in the metapackage was already installed on my system, and it completed with no errors or warnings, so the metapackage worked as intended.

With the first part done, I moved on to creating and configuring a new repository with reprepro, a tool for managing APT repositories. First, I installed the program:

sudo apt-get install reprepro

After that, according to Tero Karvinen’s instructions, I ran a few commands on the terminal:

mkdir -p repository/conf
nano repository/conf/distributions

These commands created a new folder called repository, a conf folder inside it, and a distributions file within that folder. Into the distributions file I typed some information about my current operating system version:

Codename: precise
Components: main
Suite: precise
Architectures: i386 amd64 source

Next, I added my previously created metapackage:

reprepro -VVVV -b repository includedeb precise Scripts/villes-studypackage_0.1_all.deb

It created the path files and added the .deb package into the repository without any complaints.

After that, I needed to edit the repository.list file so that clients can install packages from the repository. I typed sudoedit /etc/apt/sources.list.d/repository.list into the terminal and inserted the following line into the file:

deb http://127.0.0.1/~ville/repository precise main

That should have done the job, so I ran sudo apt-get update to refresh the package lists. To my surprise, the command failed with a 403 error and the packages could not be fetched from the address. I think the problem is with my main user’s permissions; I will attend the class tomorrow and ask how to solve it.

To complete the assignment, I was supposed to package a script. I wrote a quick shell script called localinfo, which greets the user and tells the local time and date. Here is the script:

#!/bin/sh
clear
echo "Welcome $USER!"
echo "Today is "; date
echo "Calendar"
cal
exit 0

I gave it execute rights with:

chmod 755 localinfo

Next, I built the package for the script, first creating and editing the .cfg file:

equivs-control localinfo.cfg
nano localinfo.cfg

I edited the contents to look like this:

[Image: the edited localinfo.cfg]
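A sketch of what localinfo.cfg could contain; the Files: field (which tells equivs-build to ship the script inside the package), the destination directory and the maintainer address are assumptions:

```
Section: misc
Priority: optional
Standards-Version: 3.9.2

Package: localinfo
Version: 0.1
Maintainer: Ville Paasonen <maintainer@example.com>
Files: localinfo /usr/local/bin
Description: Greets the user and shows the date and calendar
 A small shell script packaged as a .deb.
```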

Then I built the package with equivs-build localinfo.cfg. 

Finally, I installed the newly created .deb package with gdebi:

sudo gdebi localinfo_0.1_all.deb

The package was installed with no errors, and I executed the script by typing localinfo into the terminal. The results:

[Image: output of the localinfo script]
That’s it! Next assignment coming up next week.

Sources and help:

Tero Karvinen: Update all your computers with a deb repository: http://terokarvinen.com/2011/update-all-your-computers-with-a-deb-repository

Tero Karvinen: Create deb metapackage in 5 minutes: http://terokarvinen.com/2011/create-deb-metapackage-in-5-minutes

Tagged , , , , , , , ,

Assignment 3: Solving a Honeypot security breach and digging into OWASP Top 10

This week's assignment was to explain thoroughly one of the ten most common security breach types from the OWASP Top 10 and to solve HoneyNet Scan of the Month 15. Before we get into the actual assignment: take great caution if you use the disk images downloaded from HoneyNet, since they contain actual malware. They should be dealt with inside VirtualBox or from a live CD or memory stick, and any executable files found on the downloaded disk image should never be run.

I started by downloading the tar ball from HoneyNet. I extracted it into my home directory with the tar command:

mkdir HoneyNet
tar -xf honeynet.tar.gz -C HoneyNet
cd HoneyNet

This extracted the disk image and its companion files from the tarball into the folder.

Next, I installed Sleuthkit, a set of tools for forensic analysis of disk images. I didn't want to mount the disk image on the system, but luckily that wasn't required. I also created folders for the recovered files.

sudo apt-get install sleuthkit
mkdir allocated deleted

Then I ran two commands in the terminal that recovered files from the disk image into the previously created folders allocated and deleted.

tsk_recover -a honeypot.hda8.dd allocated/
tsk_recover honeypot.hda8.dd deleted/

To be clear, honeypot.hda8.dd is the downloaded honeypot disk image. The first command recovered 1614 files and the second command recovered 37 files.

Next up, a timeline was needed for the files. We know the honeypot system was compromised on 15 March 2001, so it was necessary to find the files that had been modified on or around that date.

tsk_gettimes honeypot.hda8.dd > rawtimes
mactime -b rawtimes|less

These commands tie each recovered file to its timestamps and print the events in chronological order, showing what actually happened and when. To find the events of 15 March 2001, I typed the following search into less:

/Mar 15 2001

There were also particularly suspicious events that had happened earlier, starting on 22 October 2000: a group of files named $OrphanFiles/OrphanFile-20** had been deleted from the system. These deletions occurred on a few scattered days, stopping on 3 March 2001.

Now that I had found suspicious activity, I wanted to actually check the allocated and deleted files to find more information. I typed

ls -l allocated/ deleted/

in the terminal and the following showed up:

[Screenshot - 02212014 - 08-44-14 PM: ls -l output of the allocated/ and deleted/ folders]
As you can see, there was a folder called $OrphanFiles and an alarmingly named tarball, lk.tgz. I moved to Downloads/honeynet/deleted/ and extracted the tarball with:

tar -xf lk.tgz

It extracted two folders, etc and last. I checked the contents of last with ls and it revealed very interesting things.

I decided to check the contents of some of the files. Many of them were binaries, so I didn't open them, and you shouldn't either. The files cleaner and install made me quite sure that I had located the rootkit that had been installed on the honeypot system.
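A safe way to triage recovered files without running them is the file utility, which only reads content and never executes it. This is a general technique I'd recommend here, demonstrated on a dummy file rather than the actual rootkit contents:

```shell
# Create a stand-in for a recovered file, then identify it without executing it.
printf '#!/bin/sh\necho hi\n' > demo_recovered
file demo_recovered
```

On the real image, running file over everything in last/ quickly separates text files that are safe to view in less from binaries that should stay closed.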

[Screenshot - 02212014 - 09-10-36 PM: contents of the cleaner script]
Here is a screencap of the cleaner file. It is an executable shell script that cleans the system's logs; the cracker used it to cover the traces of the attack. "Alles sauber mein Meister!" is German and translates to "Everything clean, my Master!".

[Screenshots 1 and 2: contents of the install file]
These are screenshots of the install file. Many lines make it clear that we are dealing with a rootkit. The English sentences were clearly typed by a non-native speaker, and the foreign-language sentences are Romanian, according to Google Translate.

I have to admit this assignment was very interesting to complete. It not only taught me how to recover files and do some forensic research, but also that pretty much the only way to be 100% sure that deleted files are really gone is to take a hammer and smash the hard drive to pieces.

This assignment also managed to creep me out quite a bit for some reason, even though the cracker clearly wasn’t really good at what he did. I’m glad I’m so pedantic when it comes to information security.

I used my teacher Tero Karvinen's web article "Forensic File Recovery with Linux" for help with the assignment.

To finish things up, I read up on Insecure Direct Object References at OWASP.org.

What is an Insecure Direct Object Reference?

Insecure Direct Object Reference is a type of attack against web applications and services that hold secured personal data for multiple users, such as web forums and online banking applications.

How does it happen?

For example, an online banking application uses a user's account number to refer to a specific bank account. The attacker, another user of the same application, notices that their own account number is passed as a parameter when the account info query is executed during login. They change the value of that parameter to another user's account number, and the application reveals that user's account info. This could also happen by inserting crafted data into an info or search field, and the attacker could likewise modify the parameter to redirect another user to a phishing website.

Another example is directory traversal. If user rights haven't been managed properly, simply changing the ending of a URL may render supposed-to-be-secret files, revealing important account data.

How can I prevent it from happening?

The most important thing is to track and manage user rights properly: there should be a server-side check for every user action to verify that the user has the rights to execute whatever they are about to do in the application. So-called indirect reference maps exist for developers and should be used every time. File names and internal URLs should never be exposed to the users of the application.
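As a minimal illustration of an indirect reference map (my own sketch, not from OWASP's page): the application hands the user a random token and resolves it server-side, so the real identifier never leaves the server. All names and the account number below are hypothetical:

```shell
#!/bin/bash
# Token-to-account indirect reference map (bash associative array as the
# stand-in for server-side storage; never exposed to the client).
declare -A ref_map
account="FI0012345678"                               # real account identifier
token=$(od -An -N8 -tx1 /dev/urandom | tr -d ' \n')  # opaque token for the client
ref_map[$token]=$account
echo "client sees token: $token"                     # only the token goes into URLs and forms
echo "server resolves:   ${ref_map[$token]}"
```

Because the token is random per session, guessing another user's reference is no longer a matter of incrementing an account number.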

Thanks for reading, next assignment coming up in a few days!

Assignment 2: Log analyzing and stress testing

This week our assignment was to collect stress data with Munin plugins such as cpu, mem and io, and to analyze the data. Lastly, we were supposed to generate a few lines into a particular log and analyze them too.

I started with updating the system and installing Munin with the commands:

sudo apt-get update
sudo apt-get install munin

The Munin graphs are available in any web browser by entering file:///var/cache/munin/www/ into the address bar.

Here are some screenshots of the statistics and graphs Munin has collected since installing it:

[Munin graphs: cpu-day, fw_conntrack-day, memory-day]
The graphs and statistics are a bit lackluster, as you can see, since Munin needs quite a long time of usage to collect accurate and reliable data. The graphs show no signs of peculiar activity: average memory usage, low CPU usage and a few spikes in the connections through the firewall when downloading updates and so on.

Next, I installed Stress and htop:

sudo apt-get install stress -y
sudo apt-get install htop

Stress does what its name implies: it puts a configurable load on the system. htop shows the processes Linux is running in real time, giving measurable data on the load of the system.
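For logging rather than watching, the same load information htop visualizes can also be sampled non-interactively from /proc. This is a general Linux facility, not part of the assignment:

```shell
# /proc/loadavg: 1/5/15-minute load averages, running/total tasks, last PID.
cat /proc/loadavg
cut -d' ' -f1 /proc/loadavg   # just the 1-minute load average
```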

I typed stress in the terminal and it showed the usage options. There was an example line which I used to stress the system:

stress --cpu 8 --io 3 --vm 2 --vm-bytes 128M --timeout 120s

The laptop started the stressing process and finished with the lines:

stress: info: [13132] dispatching hogs: 8 cpu, 3 io, 2 vm, 0 hdd
stress: info: [13132] successful run completed in 120s

During the test I ran htop by typing

htop

in the terminal. Screenshot:

[Screenshot - 02142014 - 01-37-24 AM: htop during the stress test]
The laptop heated up quite a bit during the test, so there might be something faulty in the cooling system. I will write an article later about finding crucial info on the condition of the system.

To finish the assignment, I ran a few commands to generate data on the log files of the system.

First, I located the authentication log file and followed it live with the commands:

cd /var/log/
tail -f auth.log

Next, I generated a few lines into the log file on purpose. I logged in as a test user on my system via SSH:

ssh mikkom@localhost

I inserted the password and managed to log in. The log file printed out:

Feb 14 01:49:39 ville-HP-EliteBook-2560p sshd[16928]: Accepted password for 
mikkom from 127.0.0.1 port 44386 ssh2
Feb 14 01:49:39 ville-HP-EliteBook-2560p sshd[16351]: pam_unix(sshd:session): 
session opened for user mikkom by (uid=0)

These lines tell of a successful SSH login. The first line gives the local time and date, the server's hostname, the SSH daemon and its PID, the accepted/failed status, the username, the source IP address and port, and the SSH protocol version. The second line carries the same basic data plus a PAM message about a new session being opened for the user.

Next, I made a failed login attempt to my main user account via SSH by entering a wrong password. The log printed out:

ssh ville@localhost
Feb 14 01:50:16 ville-HP-EliteBook-2560p sshd[16928]: pam_unix(sshd:auth): 
authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=localhost  
user=ville
Feb 14 01:50:18 ville-HP-EliteBook-2560p sshd[16928]: Failed password for 
ville from 127.0.0.1 port 44386 ssh2 

These lines tell of a failed SSH login. It tells the local time and date, the name of the server, the SSH daemon, success/failed status and the cause, IP address and the SSH protocol version number.
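Lines like these can also be summarized in bulk. Here is a small sketch that counts failed SSH logins per user; it is fed sample lines so it is self-contained, but on a real system you would point it at /var/log/auth.log:

```shell
# Two sample auth.log lines standing in for the real file.
printf '%s\n' \
  'Feb 14 01:50:18 host sshd[16928]: Failed password for ville from 127.0.0.1 port 44386 ssh2' \
  'Feb 14 01:51:02 host sshd[16930]: Failed password for ville from 127.0.0.1 port 44390 ssh2' \
  > sample_auth.log
# The username is the 6th field from the end of each matching line.
grep 'Failed password for' sample_auth.log | awk '{print $(NF-5)}' | sort | uniq -c
```

A sudden spike in this count for one user is a classic sign of a brute-force attempt.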

Next assignment coming up in a few days!

Tagged , , , , , , ,

The first assignment of a new Linux server course

Last week I enrolled on a new Linux server course at Haaga-Helia University of Applied Sciences.

The course is led and lectured by Tero Karvinen. You can find his homepage at http://www.terokarvinen.com.

The first assignment given was to finish the final test of a previous Linux course I attended, Työasemat ja tietoverkot (Workstations and Networks). I had Xubuntu 12.04 preinstalled on my HP EliteBook 2560p laptop, so I naturally skipped installation and setup, but if you are interested in that, see my previous posts on this blog.

When the workstation was installed and ready, I had to add users. First I ran an update via the Terminal with the command:

        sudo apt-get update

Next, I created the users for Mikko Mallikas, Maija Mehiläinen, Einari Vähäkäähkä, Ik E, Veijo Miettinen and Håkan Persson. This happens with the terminal commands:

        sudo adduser mikkom
	sudo adduser maijam
	sudo adduser einariv
	sudo adduser ike
	sudo adduser veijom
	sudo adduser hakanp (The letter å isn't worth keeping in a username)

I gave each one a strong password and saved the usernames and passwords in a file in my encrypted home directory with the command:

        nano userinfo.txt
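One convenient way to come up with strong passwords for such accounts (my own habit, not a course requirement) is openssl:

```shell
# 12 random bytes, base64-encoded: a 16-character password.
pw=$(openssl rand -base64 12)
echo "$pw"
```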

Note that I wasn't running as the root user, and I don't recommend doing that unless it is absolutely necessary. Sudo lets a normal user run commands with admin rights when the password is known, so it usually isn't necessary to run Xubuntu (or any other Linux distribution) as root. When running as root, the sudo command naturally isn't needed.

Next, I installed the web server to make example homepages for the new users with the command:

        sudo apt-get install apache2

Apache is a commonly used, reliable and stable web server, so I decided to install it. To check whether it works, I opened a web browser and typed "localhost" into the address bar. The default page appeared: "It works! This is the default web page for this server. The web server software is running but no content has been added, yet."

So it seems that Apache was installed properly and it is up and running. I also ran the following commands:

        sudo a2enmod userdir
        sudo service apache2 restart

These commands enabled the userdir module (needed later, for example for per-user PHP pages) and restarted the server software.

The users requested other software to be installed, such as support for PHP, MySQL, Python and Java. The commands:

        sudo apt-get install php5
	sudo apt-get install mysql-server
	sudo apt-get install phpmyadmin
	sudo apt-get install ssh
	sudo apt-get install openssh-server
	sudo apt-get install python-dev

PHP 5 may need a few adjustments to work properly, so I ran the following commands just in case:

        sudo apt-get install libapache2-mod-php5
	sudo a2enmod php5
	sudo service apache2 restart

These commands install the PHP 5 Apache module, enable it and restart the server. The PHP configuration file of Apache must also be adjusted:

        cd /etc/apache2/mods-enabled
        sudoedit php5.conf

Comment out the following lines by adding hash characters at the beginning of the lines:

        # <IfModule mod_userdir.c>
        # <Directory /home/*/public_html>
        # php_admin_value engine Off
        # </Directory>
        # </IfModule>

and restart the server as done before.

User Einari wants to use PHP and run Hello world. First I logged in as Einari via SSH:

        ssh einariv@localhost

I inserted his password, created a new folder called public_html in his home directory and made him a file for the Hello world PHP code. Since einariv owns his home directory, sudo isn't needed here (and a plain user usually isn't a sudoer anyway):

        cd /home/einariv
        mkdir public_html
        cd public_html
        nano helloworld.php

I copied the following in the helloworld.php file:

        <?php echo "Hello php!"; ?>

I ran the code via terminal with the command:

        php helloworld.php

and it printed out "Hello php!". So now we have PHP ready to use, with the needed modules and settings. Next, we are supposed to run Hello world with Python as user Maija:

        ssh maijam@localhost
        cd /home/maijam
        nano helloworld.py

I inserted the following into the Python file:

        print 'Hello world!'

and tested it in the terminal with:

        python helloworld.py

It printed out “Hello world!” so now Python is also installed and ready to use.

Lastly, I installed the ufw firewall to secure remote access to the computer and enabled it with the commands:

        sudo apt-get install ufw
        sudo ufw enable

SSH listens on port 22 and the Apache web server on port 80, so I decided to allow traffic on those ports with the commands:

        sudo ufw allow 22
        sudo ufw allow 80
        sudo ufw default deny

The last command sets ufw's default policy to deny all connections except the ones I allowed previously, on ports 22 and 80. Lastly, the system had to be restarted for the settings to take effect:

        sudo shutdown -r 0 
        (save all work before executing this command since it shuts down 
        the whole system.)

So that's it! I'm really happy to be using Linux again after quite a long time. My previous knowledge had rusted a bit, but completing this assignment really reminded me of what I once knew well.

Next assignment coming up next week, stay tuned!

Tagged , , , ,

Working title Lyrical Battle – Upcoming Facebook game

Hello everyone after quite a long time!

I’m currently developing a new kind of social Facebook-embedded game with my friends and fellow ICT students Oskari Viilo and Niko Kiuru of HAAGA-HELIA University of Applied Sciences.

The game’s called Lyrical Battle, and as I mentioned before, it’s quite a unique kind of game, designed primarily for all you productive poets and lyricists.

LyricalBattle

The idea came up while brainstorming with my friends. Oskari and I make music and lyrics in our free time occasionally. We both have the same problem: we love making creative stuff but are pretty unsure about publishing and sharing it, particularly online. While Lyrical Battle might not remove all the insecurity of publishing your own creations online, it will hopefully lower the threshold to do so.

Lyrical Battle, as the name implies, is about writing lyrics competitively. The idea is simple but fun: first, a lobby is created. The creator gives the lobby a name and a theme and writes the first line of the poem/song/short story. Then people around the world may join that lobby and suggest a new line within a time limit. The line that receives the most votes from other users wins and becomes the next part of the writing. The complete piece, written by multiple brilliant minds, is downloadable after it has been finished. The intention is also to build a creative community around the game, help people work constructively with each other and encourage them to publish their good stuff further. There isn't anything like this online yet, so we believe we can bring something original and fun to all of you. Hope to see you in the game!

Lyrical Battle will be released in late December, stay tuned for updates!

Tagged , , , , , ,

My personal thoughts and highlights of DigiExpo 2012

I luckily got to visit DigiExpo, which took place 2.-4.11.2012 at the Finnish Exhibition & Convention Centre. It's the biggest annual IT-oriented fair in Finland. I hadn't visited DigiExpo for a couple of years and was really excited to go after such a long time. I was quite pleased with this year's offerings. Here's a short report of my personal highlights.

Nokia Lumia 920 and Windows Phone 8

I've been really skeptical about Nokia lately, but ever since I got my hands on Windows phones, especially Nokia's, I've grown more and more interested in the OS. I've had only Android phones for the last few years, but when my package deal with the Samsung Galaxy SII ends, I will seriously consider getting a Windows Phone, most likely the Nokia Lumia 920 or whatever Nokia's flagship phone is at the time.

The Lumia 920 is filled with so many useful and interesting innovations that it's really hard not to like the device. The presenter was also really informative about all the new features of the device and the OS. I only had about five minutes to try it myself, but I could test all the features that matter to me, and again, I was very, very happy. It was smooth, fast, exciting, innovative and effective.

The only big flaw of the Lumia 920 is the rather small number of available applications, but the number of apps in the Windows Marketplace is growing rapidly, so the flaw should be temporary, at least for me. Of course app enthusiasts will stick to their iPhones and Androids, but I really hope consumers will realize the potential of Windows Phone 8.

I feel so happy for Nokia, since they have finally created something really modern, distinguishable and significant. If Nokia goes down, it will go down only because of the ignorance of the consumers. Also, Microsoft has created a great OS, which is a huge improvement over Windows Phone 7. Way to go!

Nintendo Wii U

This upcoming console is something I’ve been hyping since its confirmation. It represents brand new, rather futuristic technology when it comes to gaming. The second screen embedded into the gamepad is very revolutionary, and it really could be something to see in many future consoles. The games developed for the system alone have been clearly well prepared and make use of the second screen really well. I tried a few games such as Trine 2 and New Super Mario Bros. U and was really blown away by the possibilities and features, though I had seen much gameplay footage before actually testing the system. As a gamer, I’m not a freak for graphical prowess but it’s good to see Nintendo finally coping with its competitors on that side too.

Windows 8

Microsoft had huge exposure at the fair (what a surprise) with Xbox 360, Windows Phone 8 and, maybe most importantly, Windows 8. I'm not really sure what to think of the system, since I haven't been able to get my hands on devices that run Windows 8. I did see it in action, and it seemed visually pretty cool, but it sure didn't look like a traditional Windows OS. I have read some devastating reports concerning the general interest in Windows 8, and it seems people mainly prefer Windows 7 to Windows 8. It clearly has some rather good innovations, but I hope Microsoft hasn't focused too much on the graphical side and left usability in a minor role. On the other hand, the OS could be really nice, but it might share the destiny of Windows Vista: a visually cool system, but... it sucked. We'll see; I will surely continue using Xubuntu/Ubuntu and Windows 7, at least until Windows 8 is stable and proven really worth it.

Game Development

I was really happy to see so many stands on Finnish game design. I actually got to meet one of the developers of Trine 2 and chat with him for a moment about the present and future of game design. Many Finnish colleges and polytechnics had nice information about game design and were promoting their very own training programs focusing solely on it. Now that's neat; I had no idea they run actual training programs for that! It's really promising, since there is much hidden talent in Finland in that area, and they're also getting really good funding for the topic.

DigiExpo in a nutshell

DigiExpo wasn't the only fair that weekend. I also visited SkiExpo, HifiExpo and BoardExpo with the same ticket and was really not happy with them. The effect of the economic recession could be seen pretty clearly, since the other expos felt really limited. They had good innovations and products as well, but not many of them.

There were also some nice things going on with Android, iOS, cameras and such, but to be honest they didn't have a huge impact on me.

DigiExpo, on the other hand, didn't show that effect. It seems that people living through an economic recession still want to, and have to, invest in electronics and IT. And that's a good thing, since IT is the business of the future, and I'm really happy to be a part of it through my studies.

For more info about the next DigiExpo in 2013, go to:

http://web.finnexpo.fi/Sites1/DigiExpo/en/Pages/default.aspx

Tagged , , , , , , ,