
How to make Plex Media Server available remotely through Apache, without a Plex account

Hello there. 

Plex Media Server

I use Plex (plex.tv). It’s a fantastic tool for managing your videos and movies: you can stream anything to anything with a modern browser. The thing is, if you want to share your stuff over the Internet and watch a film you’ve got at a friend’s house, you need to create an online account on the Plex home page. I’m not too fond of creating accounts just to share what I already have.

There are several ways to reach your Plex Media Server from the outside, but in this post we’re gonna use Apache (you need to install it if you don’t already have it). In my home network I have my own web server (if you are reading this, you’re on it right now ;). On the same network I’ve also got an Ubuntu server with Plex installed.

How do we connect the two services? Apache has got this neat pair of directives called ProxyPass and ProxyPassReverse. So what I did was the following:

  • Create a subdomain
  • Create a .conf file for Apache (see below):
  • sudo vim /etc/apache2/sites-available/subdomain.jima.cat.conf
  • Then cd to /etc/apache2/sites-enabled and make a symlink:
  • sudo ln -s ../sites-available/subdomain.jima.cat.conf subdomain.jima.cat.conf
  • Restart Apache: sudo service apache2 restart
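
One gotcha: ProxyPass and ProxyPassReverse are provided by mod_proxy and mod_proxy_http, which are not enabled by default on Ubuntu. If Apache complains about an invalid ProxyPass command when you restart, enable the modules and restart again:

sudo a2enmod proxy proxy_http
sudo service apache2 restart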

This is the .conf file. Change the domain names, IP address and paths to your own values. The <Proxy> block at the bottom is only needed if you want some sort of protection; it makes it impossible for anyone to enter your site without a proper username and password. See below the config for how to create the password file.

<VirtualHost *:80>
   ServerName "subdomain.domainname.com:80"
   ServerAdmin "admin@domainname.com"
   ProxyPass / http://192.168.2.51:32400/
   ProxyPassReverse / http://192.168.2.51:32400/

   CustomLog /var/www/vhosts/domainname.com/logs/subdomain.access_log common
   ErrorLog "/var/www/vhosts/domainname.com/logs/subdomain.error_log"

   ProxyPreserveHost on
   <Proxy *>
      AuthType Basic
      AuthName "password protected..."
      AuthUserFile "/var/www/vhosts/domainname.com/.htpasswd"
      Require valid-user
   </Proxy>

</VirtualHost>
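
To create the password file that AuthUserFile points to, you can use Apache’s htpasswd tool (the path and username here are just placeholders, adjust them to your own setup):

sudo htpasswd -c /var/www/vhosts/domainname.com/.htpasswd myuser

The -c flag creates the file, so leave it out when adding more users to an existing file.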

Now try it from the outside: http://subdomain.domainname.com/web
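
If you enabled the password protection, you can also do a quick check from a terminal (curl prompts for the password; myuser is just a placeholder):

curl -u myuser http://subdomain.domainname.com/web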

Cheers
/jima

Disallow robots from indexing your site – robots.txt

About “robots.txt”. When you have a folder, or even a complete site, that you don’t want indexed and searchable on Google or Bing, you can easily fix this by creating a robots.txt file and putting it in the top (root) folder of your site. When a robot enters your site, it will first of all read this simple text file and index only what you want it to index. One warning to keep in mind, though: it will only be respected by well-behaved robots. There are people writing their own robots to harvest emails, images, addresses or whatever, and those won’t even read it. Anyway, some robot stuff:

Disallow ALL:

User-agent: *
Disallow: /

Allow ALL:

User-agent: *
Disallow:

Exclude some folders for ALL robots:

User-agent: *
Disallow: /documents.html
Disallow: /misc.php
Disallow: /cgi-bin/ 
Disallow: /private/

Allow only Google:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

Block /private/ for everyone except bingbot:

User-agent: *
Disallow: /private/

User-agent: bingbot
Disallow:
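
To verify that the file is actually being served from the root, fetch it the way a crawler would (the domain is a placeholder):

curl http://www.domainname.com/robots.txt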

Well, there you go, a few useful robots.txt examples…

Cheers
/jima