
Add StartSSL (StartCom) certificates to your Apache website (and Postfix)

Today we’re gonna add an SSL certificate from StartSSL. They have completely free certificates that have to be renewed every year. There is not much to it really when you know what to do, but when you don’t, it’s a hassle.
This is what I came up with to make it work:

Go to StartSSL and follow their guide. It’s quite straightforward. In this example we are going to create a certificate for ssl.jima.cat.

StartSSL will at one point ask you to create a private key and a certificate signing request (CSR) with openssl. The command will ask you some questions, and when it asks for your FQDN you set it (in this example) to ssl.jima.cat.

openssl req -newkey rsa:2048 -keyout private.key -out ssl.jima.csr

This will create that private key for you. However, this command forces you to set a passphrase on the key, which makes Apache ask you to type in the password every time you restart your web server. If you don’t want this (I don’t), type the very same command as above and add the -nodes option:

openssl req -nodes -newkey rsa:2048 -keyout private.key -out ssl.jima.csr

You will get output like this, plus two new files:

Generating a 2048 bit RSA private key
..................................................................+++
.....................................................................................................................................+++
writing new private key to 'private.key'
-----
You are about to be asked to enter information that will be incorporated into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:ES
State or Province Name (full name) [Some-State]:Barcelona
Locality Name (eg, city) []:Barcelona
Organization Name (eg, company) [Internet Widgits Pty Ltd]:jima
Organizational Unit Name (eg, section) []:
Common Name (e.g. server FQDN or YOUR name) []: ssl.jima.cat
Email Address []:
 
Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:
An optional company name []:
 

Now you can copy and paste the content of your .csr file into the StartSSL site and generate your certificate there. Download the ssl.jima.cat.zip and .pem files.
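
If you want to double-check the request before pasting it (for instance that the Common Name came out right), openssl can verify it and print it back:

openssl req -noout -text -verify -in ssl.jima.csr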

Extract the contents of ssl.jima.cat.zip. Then extract the ApacheServer.zip inside it and remove all the zip files (rm *.zip) as they are not needed any more (unless you need them for something other than your Apache server).

You need to merge these two certificates (the root bundle and your own certificate) into one file, or certain browsers will not recognize or accept your certificate (like Chrome on your Android phone).

cat 1_root_bundle.crt 2_ssl.jima.cat.crt > 1_and_2_merged.crt
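
While you are at it, it’s worth checking that the private key actually matches the certificate you downloaded. These two commands should print identical checksums:

openssl x509 -noout -modulus -in ssl.jima.cat.pem | openssl md5
openssl rsa -noout -modulus -in private.key | openssl md5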

Now you have everything you need to secure your Apache server. Edit your httpd.conf file and add the following:

<VirtualHost *:443>
   ServerName "ssl.jima.cat"

   SSLEngine on
   SSLProtocol all -SSLv2
   SSLCipherSuite ALL:!ADH:!EXPORT:!SSLv2:RC4+RSA:+HIGH:+MEDIUM
   # the certificate you downloaded, the key you generated, and the merged bundle
   SSLCertificateFile /etc/ssl/ssl.jima.cat.pem
   SSLCertificateKeyFile /etc/ssl/private.key
   SSLCertificateChainFile /etc/ssl/1_and_2_merged.crt
</VirtualHost>
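
After a syntax check and a restart, you can verify from the outside that the full chain is actually being served (using our example hostname; depending on your distro the first command may be apache2ctl):

apachectl configtest
sudo service apache2 restart
openssl s_client -connect ssl.jima.cat:443 -servername ssl.jima.cat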

 

NOTE:
You can use the same files to create a valid certificate bundle for Postfix as well:

cat private.key ssl.jima.cat.pem 2_ssl.jima.cat.crt 1_root_bundle.crt > postfix.cert
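
Then point Postfix at the bundle in /etc/postfix/main.cf. Since the private key is included in the same file, both parameters can reference it (the path here is just an assumption, adjust it to wherever you keep the file):

# key and certs live in the same bundle file
smtpd_tls_cert_file = /etc/postfix/postfix.cert
smtpd_tls_key_file = /etc/postfix/postfix.cert
smtpd_tls_security_level = may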

 

That’s all for today folks.
/jima

How to make your Plex Media Server available remotely without an account, through Apache

Hello there. 

Plex Media Server

I use Plex (plex.tv). It’s a fantastic tool to manage your videos and movies; you can stream anything to anything with a modern browser. The thing is, if you want to share your stuff over the Internet and watch a film you’ve got while at a friend’s house, you need to create an online account on the Plex home page. I’m not too fond of creating accounts to share what I have.

There are several ways you can reach your Plex media server from the outside, but in this post we’re gonna use Apache (you need to install it if you don’t already have it). In my home network I have my own web server (if you are reading this, you’re on it right now ;). On the same network I have also got a proper Ubuntu server with Plex installed.

How do we connect the two services? Apache has got this neat thing called ProxyPass and ProxyPassReverse. So what I did was the following:

  • Create a subdomain.
  • Create a .conf file for Apache (see below): sudo vim /etc/apache2/sites-available/subdomain.jima.cat.conf
  • Then cd /etc/apache2/sites-enabled and make a link: sudo ln -s ../sites-available/subdomain.jima.cat.conf subdomain.jima.cat.conf
  • Restart Apache: sudo service apache2 restart
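
One thing to keep in mind: ProxyPass and ProxyPassReverse are provided by mod_proxy and mod_proxy_http, so on a Debian/Ubuntu-style Apache you will most likely need to enable those modules first:

sudo a2enmod proxy proxy_http
sudo service apache2 restart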

This is the .conf file. Change the server names, IP address and paths to your own values. The <Proxy> block is only needed if you would like some sort of protection; it makes it impossible for anyone to enter your site without a proper username and password. You can see how to create the password file right after the config below.

<VirtualHost *:80>
   ServerName "subdomain.domainname.com:80"
   ServerAdmin "admin@domainname.com"

   # forward everything to the Plex server on the internal network
   ProxyPass / http://192.168.2.51:32400/
   ProxyPassReverse / http://192.168.2.51:32400/
   ProxyPreserveHost on

   CustomLog "/var/www/vhosts/domainname.com/logs/subdomain.access_log" common
   ErrorLog "/var/www/vhosts/domainname.com/logs/subdomain.error_log"

   # optional: Basic Auth so nobody gets in without a username and password
   <Proxy *>
      AuthType Basic
      AuthName "password protected..."
      AuthUserFile "/var/www/vhosts/domainname.com/.htpasswd"
      Require valid-user
   </Proxy>
</VirtualHost>
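
For the password protection part, the .htpasswd file referenced above can be created with the htpasswd tool (part of apache2-utils on Ubuntu). The -c flag creates the file, so only use it the first time; "yourusername" is of course just a placeholder:

sudo htpasswd -c /var/www/vhosts/domainname.com/.htpasswd yourusername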

Now try from the outside: http://subdomain.domainname.com/web

Cheers
/jima

Disallow robots from indexing your site – robots.txt

About robots.txt. When you have a folder, or even a complete site, that you don’t want indexed and searchable on Google or Bing, you can easily arrange that by creating a robots.txt file and putting it in the top (root) folder of your site. When a robot enters your site it will first of all read this simple text file and index only what you want it to index.

A warning though: keep in mind that this will only be respected by well-behaved robots. There are people making their own robots to harvest mails, images, addresses or whatever, and those will not even read the file.

Anyway, some robot stuff. Disallow ALL:

User-agent: *
Disallow: /

Allow ALL:

User-agent: *
Disallow:

Exclude some folders for ALL robots:

User-agent: *
Disallow: /documents.html
Disallow: /misc.php
Disallow: /cgi-bin/ 
Disallow: /private/

Allow only Google:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

Allow only the bingbot access to the private section (everyone else is kept out of /private/):

User-agent: *
Disallow: /private/

User-agent: bingbot
Disallow:
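
Remember that the file has to be reachable at the root of your site, so a quick sanity check from any machine (using this site as the example) is:

curl http://www.jima.cat/robots.txt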

Well, there you go, a few useful robots.txt examples…

Cheers
/jima