Installing MediaWiki on a Raspberry Pi 3

We’ve all been there, and by ‘all’ I mean technicians, and by ‘there’ I mean at a job site where you have no idea what the password is, and the people back at the office ‘may have’ it written down somewhere from some time ago.

Fortunately, I have an effective solution that could be of major use to you.

Well, it is actually two things: a small SBC (single-board computer) called the Raspberry Pi 3, and a piece of software called ‘MediaWiki’. This is the same programme that runs Wikipedia, the site that allows anyone to add, modify and correct information on the world’s largest encyclopedia.

 

And because it runs on a small SBC, you can leave it inside a network cabinet; as long as it has power, you should be fine.

For this project you will need the following:
  1. A Raspberry Pi 3 with a power supply and a 16 GB SD card with NOOBS already configured on it. (See here for how that is done.) You might be able to get away with an 8 GB card, but I recommend 16 GB as a minimum.
  2. An internet connection.
  3. A monitor that you can plug the Pi 3 into temporarily (at least until remote access is enabled).
  4. If you are running Windows on the PC that you will use to connect to the Pi 3 remotely, you will need two utilities: PuTTY and PSCP. Both are available at http://www.putty.org/
  5. Knowledge of how to follow tutorials.

 

This tutorial assumes that you already have a pre-configured SD card with the latest Raspbian version on it, so just get it connected to the network via an RJ45 cable and power it on first.

ENABLE SSH (FOR REMOTE LOGIN)

If remote access via SSH is not enabled, you can do so from the terminal: run ‘sudo raspi-config’, select ‘Interfacing Options’, select ‘SSH’ and choose the option to enable it.
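If you would rather stay entirely on the command line, the following should also work on recent Raspbian images (this is an assumption on my part, based on those images using systemd; the raspi-config route above is the documented one):

sudo systemctl enable ssh
sudo systemctl start ssh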

This is also a good time to get the IP address of the Pi 3. You can find it with a network scanner (looking for the Pi’s default hostname, ‘raspberrypi’), by looking it up in the DHCP leases on your server or router, or simply by running the terminal command ifconfig on the Pi itself.
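For example, either of these commands run on the Pi itself (while it is still connected to the monitor) will show its address; hostname -I simply prints the addresses without the rest of the interface details:

ifconfig
hostname -I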

Mine is currently showing that it is 192.168.1.81.

Connecting to the Pi 3 over SSH from Windows (PuTTY)

If you are using Windows (the Linux command will follow), this is done through PuTTY. Once you have downloaded and installed PuTTY, open it up.

Putty screenshot

In the field underneath ‘Host Name (or IP address)’, put in your Raspberry Pi’s IP address and click ‘Open’. It may ask you to confirm a security or host key; click ‘Yes’, and another window will open asking for a username. The default username is ‘pi’; type it in and press Enter. It should then ask for a password (if you logged in as ‘pi’, the default password is ‘raspberry’).

SSH COMMAND IN LINUX TERMINAL

If you are running Linux (any distribution, it doesn’t matter), you can open a terminal and run the following command:

ssh pi@(IP_ADDRESS_OF_PI)

If you are using a different username on the Pi, substitute it, and put in the IP address of your Pi 3 in place of the placeholder, then press Enter.

If you are asked to confirm the host key after running the command, type ‘yes’, press Enter, then put in the password and press Enter again.
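For reference, the first connection usually produces a host-key prompt roughly like this (the key type and fingerprint will differ on your Pi; treat this purely as an illustration):

The authenticity of host '192.168.1.81 (192.168.1.81)' can't be established.
ECDSA key fingerprint is SHA256:...
Are you sure you want to continue connecting (yes/no)?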

Now we can start getting the software downloaded.

Let’s first download MediaWiki. The version we are going to install is 1.30, the latest version at the time of writing.

Type in the following commands in order:

cd ~/Downloads

(If the Downloads folder doesn’t exist, create it first with ‘mkdir ~/Downloads’.)

wget https://releases.wikimedia.org/mediawiki/1.30/mediawiki-1.30.0.tar.gz
tar -xvzf mediawiki-1.30.0.tar.gz
sudo mkdir /var/lib/mediawiki
sudo mv mediawiki-*/* /var/lib/mediawiki
cd /var/www/html
sudo ln -s /var/lib/mediawiki mediawiki
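If you want to be sure the symlink ended up pointing at the right place before moving on, a quick (entirely optional) check is:

ls -l /var/www/html/mediawiki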

Now update the existing software on your Pi 3:

sudo apt-get update
sudo apt-get upgrade

Once done, run the following command to install the base packages required for MediaWiki.

sudo apt-get install apache2 mysql-server php php-mysql libapache2-mod-php php-xml php-mbstring

If the packages have already been installed, the command will skip those and install the others.
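If in doubt after the PHP packages go in, restarting Apache does no harm (this assumes a systemd-based Raspbian, which recent releases are):

sudo systemctl restart apache2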

CONFIGURATION OF MYSQL

During the installation of MySQL server, you will be prompted to enter a password for the MySQL root account. If for some reason no prompt is given, please run the command below.

 mysqladmin -u root password "enter the new password here"

Remember not to use the same password for the system root and MySQL root accounts.
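Optionally, if your MySQL package includes it, you can also run the bundled hardening script, which walks you through removing the test database and anonymous accounts; a sensible step for a box that will sit unattended in a cabinet:

sudo mysql_secure_installation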

We’re going to create a database for our MediaWiki installation to use which will be named ‘wiki’. You can use any other name you wish.

MySQL comes with a shell that can be used for configuration. We can use the MySQL shell to create a database:

$ sudo mysql -u root -p
Enter password:
mysql> CREATE DATABASE wiki;
mysql> USE wiki;

The ‘USE wiki’ command tells the shell to use that database for future operations in this shell session. MySQL supports user accounts, so we need to create a database user and give it access to the database:

mysql> CREATE USER 'wiki'@'localhost' IDENTIFIED BY 'password';
mysql> GRANT ALL PRIVILEGES ON wiki.* TO 'wiki'@'localhost'; 
mysql> FLUSH PRIVILEGES;
mysql> quit

This creates a user called ‘wiki’ and assigns it the password ‘password’. This user is allowed to connect to the database from ‘localhost’.

Initially, the new user has no privileges, so it must be granted some access rights using the ‘GRANT’ command. I have used ‘ALL’ in this example, but in real world applications it would be better to grant users more limited rights. A complete list of privilege options is available at http://dev.mysql.com/doc/refman/5.1/en/grant.html.
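As a rough illustration of what a tighter grant might look like for a MediaWiki database (the exact privilege list you need can vary with the MediaWiki version and any extensions, so check before relying on it):

mysql> GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, ALTER, INDEX, DROP, LOCK TABLES ON wiki.* TO 'wiki'@'localhost';
mysql> FLUSH PRIVILEGES;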

CONFIGURATION OF MEDIAWIKI

Now, on the PC you are connecting to the Pi from, you can finally configure MediaWiki. Open a new browser tab or window and browse to the following:

http://ip_address_of_pi3/mediawiki

If you get a page that says ‘LocalSettings.php not found’, click on ‘set up the wiki first’. If you do not get that page, then you have skipped a step somewhere and will need to start over.

Go over the next few screens carefully. You can leave all of the settings at their defaults, but when you get to the database connection settings, use the same values as in the MySQL database creation we did in the MySQL shell moments earlier.

On the page where it asks for a name for the wiki, you can name it whatever you want; for testing purposes I will call it ‘temporary wiki’. It will also ask for a username and password (the last time it asks for usernames and passwords, I promise). You can even put in an email address so that it can send out emails for security purposes (how to set that up will be in a follow-up tutorial).

On the last screen, click ‘Continue’ and leave it until another page loads asking you to download a file named ‘LocalSettings.php’. Save this file somewhere you can access it.

You will now need to get this file onto your Pi 3.

If you are running Windows:

Browse to where you saved the file ‘pscp.exe’ (downloaded earlier, see the requirements list). In the white open space, hold your left Shift key and right-click at the same time; an option to ‘Open command window here’ will appear. If you downloaded the ‘LocalSettings.php’ file to the same place, you can run the following command as-is; if not, either copy the file to where pscp.exe is or adjust the paths in the command accordingly.

pscp LocalSettings.php pi@(IPADDRESS_OF_PI):/home/pi/Downloads

In a Linux terminal, browse to where you saved the ‘LocalSettings.php’ file and run the following command.

scp LocalSettings.php pi@(PI_IP_ADDRESS):/home/pi/Downloads

Once the file is there, you can move it from where we copied it to the correct place. This is done via the remote terminal (the SSH session) with the command:

 sudo mv ~/Downloads/LocalSettings.php /var/lib/mediawiki/
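Because LocalSettings.php contains the database password, it is worth tightening its permissions so that only root and the web server user (www-data on Raspbian) can read it. Something along these lines should do, although the exact ownership scheme is a matter of preference:

sudo chown root:www-data /var/lib/mediawiki/LocalSettings.php
sudo chmod 640 /var/lib/mediawiki/LocalSettings.php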

Now you can go back to the same address on the same PC where we started the MediaWiki configuration (http://ip_address_of_pi3/mediawiki) and you will see your wiki’s front page.

Any further changes are made in that ‘LocalSettings.php’ file we moved to the Pi. I will be adding more tutorials regarding this wiki (including daily backups of the database, etc.) in due course.
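As a quick preview of the backup side, a one-off dump of the database can be taken with mysqldump, using the ‘wiki’ database and user created earlier; wiring this into cron for daily runs is what the follow-up will cover:

mysqldump -u wiki -p wiki > ~/wiki-backup.sql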

 

In the meantime, please comment if I have missed anything or something needs clarification; I will get back to it when I can.

 

Free Linux+ course

… but cannot afford to take a course in the subject.


Why am I doing this?

A very simple reason: I like Linux, and I have always had a yearning to help bring more people the knowledge of what Linux can do. I also had a work colleague tell me that he was going to report me. I asked why, and he said ‘because you have illegal software on your PC’. Granted, this person is in his 60s and has only been working with PCs for the last 10 or so years, but that attitude comes from the marketing power of Microsoft and Windows, coupled with the fact that just about every OEM-built PC and laptop has Windows pre-installed on it.

I have learnt a lot from the Linux community over the many years that I have been a part of it, and I want to ‘Pay It Forward’ by helping others learn about this great system. Just like in the 2000 Kevin Spacey film of the same name, I aim to do exactly that.

How long will it take for me to complete the course?

I will do my utmost to have each of the 4 modules up on this site as soon as it is ready. As I add each module to the site I will create a link on this page, which should help motivate me to complete the task. Also, once I get a semi-decent microphone (the current one probably sounds like a cat being skinned when I speak into it), I will do a video version of these tutorials and post the links on each page.

If you want to know how long each module will take you to complete, I honestly cannot say, as I am definitely not you. It all depends on how willing you are to learn.

 

Now there are 4 modules that I will be splitting this course into. They are:

 

  1. Introduction to Linux – This is where I give an introduction and a short history of Linux.
  2. Installation of Linux – I will guide you step by step through the installation of a Linux distribution, explaining each step in detail (partitions, file systems in use, mount points, etc.).
  3. System administration – Here I will show you the many ways to install a programme (whether available through the software repositories or not), how to add local users, create files and directories, and some basic troubleshooting commands.
  4. Preparation for the Exam – This is probably self-explanatory, even though I will not be the one administering the exam.

 

Review – OpenRA

System requirements:

Shader Model 2.0 capable GPU (Radeon 9500+ / GeForce FX5200+ / GMA X3100+)
1GB Free Hard Drive Space 
512MB RAM (the game itself uses ~100MB)
Sound
3 Button Mouse
Network Card (desirable)
Display capable of at least 1024×768

(so basically any pc made in the last 10 years)


There were many games developed during the late 80s and early 90s that became cultural icons in their own right. They became the yardsticks by which future games in their genre would be measured: Super Mario Bros. for platformers, Doom for FPS games, Ultima for RPGs, Myst for adventure games.

In 1995, a developer called Westwood Studios released one of the most ambitious titles in gaming history up to that point: a game set on an alternate Earth in modern times, telling the story of a fictional war from both sides. They used every trick they could to tell a compelling story from each side of the conflict, complete with hammy acting in the video cut-scenes that played before each mission.

I am talking about Command & Conquer. After it found very quick success, Westwood pushed out an expansion pack, ‘The Covert Operations’, in 1996. The same year, they released the first spin-off game (which had been planned as another add-on to the original) called ‘Red Alert’. This game took what worked in the original and improved on it: better art, better sound, better everything.

The following year they released not one but two expansion packs, ‘Counterstrike’ and ‘The Aftermath’, adding all this goodness to an already great game.

Then in 1998, Westwood Studios went back to one of their original successes and remade it. Dune 2000 was released and, although it used the same engine as Red Alert, it did not fare very well critically, with IGN and GameSpot giving it 5/10 each.

All of these games are among my favourite games of all time, and I have bought every edition of them that I could find. In the past few years (since I made the switch to Linux), though, I could not enjoy them as much in emulated environments (DOSBox and VirtualBox). So I was pleasantly surprised when I stumbled upon OpenRA while browsing Google; I had heard that Command & Conquer and Red Alert (including the add-on packs) had become freeware, and I was looking for a way to get these games running on Linux again.

I was surprised not only to find this program but also to see that it is open source, updated to use the hardware acceleration of modern video cards via OpenGL, with cross-platform positional sound built on OpenAL. The people behind it have not only managed to recreate the engine powering these games; they also let you download the assets needed to play from a server (if you do not have the CD or the freely available ISO files).

Gameplay.

While I (and possibly you) love the classic RTS gameplay, multiplayer game design has evolved significantly since the early 1990’s. The OpenRA mods include new features and gameplay improvements that bring them into the modern era:

1. A choice between “right click” and classic “left click” control schemes

2. Overhauled sidebar interfaces for managing production

3. Support for game replays (in both single-player and multiplayer) with an observer mode designed for online streaming

4. The ‘fog of war’ that obscures the battlefield outside of the line-of-sight of your controlled units/buildings

5. Civilian buildings that can be captured and used for strategic purposes.

6. Units gain experience as they fight, improving with each rank.

In short, these are classic games brought up to modern standards, and that applies to each ‘mod’.

Plot/Story.

While the game is focused more on multiplayer, there is still a large single-player component, with many of the original missions recreated in this engine.

Each of the three mods has its own story:

Dune 2000


Three great houses fight for the precious spice melange.
He who controls the spice controls the universe!
Establish a foothold on the desert planet Arrakis, where your biggest threat is the environment.

Command & Conquer – Tiberian Dawn


An alliance of nations fights to protect Europe and northern Africa from a mysterious terrorist organization and the valuable but toxic alien mineral, Tiberium, that is slowly spreading over the world.

Command & Conquer – Red Alert


In a world where the Third Reich never existed, the Soviet Union seeks power over all of Europe. Allied against this Evil Empire, the free world faces a Cold War turned hot.

Conclusion

This is an amazing piece of work that is steadily improving all the time, and I cannot wait until they implement their next big feature: support for Command & Conquer 2 – Tiberian Sun!

I will happily rate this ‘game’ an extremely well-deserved 85/100.

Mad Max coming to Linux & Mac on 20 October 2016

Another good game coming to my favourite platform (Linux that is).

To put a different spin on a line from the latest Mad Max movie – ‘What a game, what a lovely game’. Now I cannot wait to get it running on my Linux powered PC.

For the six or maybe seven people who do not know who ‘Mad Max’ is, I will tell you. He is a fictional character, a survivalist in a series of four movies (not to mention a comic book and two games: a 1990 release and this one, based on the 2015 film), all directed by the great George Miller and starring Mel Gibson as ‘Mad’ Max Rockatansky in three of them. Tom Hardy replaced him in the 2015 reboot.

Below is footage of some of the gameplay from last year’s E3.

Now I cannot wait for the 20th. I hope that there will be more great games coming for Linux in the future.

Listed below are the official system requirements for Mad Max from Feral Interactive.

OS – SteamOS or Ubuntu 16.04 or newer

Processor – Intel Core i5 3.4 GHz or AMD FX-8350 3.4 GHz or better

RAM – 8 GB

Graphics – An Nvidia GeForce GTX 660 Ti 2 GB graphics card or better, with proprietary driver version 367.35 or newer (AMD cards will not be supported)

Storage – About 32 GB of space on your hard drive

For Macintosh:

OS – Mac OS X 10.11.6

Processor – Intel 3.2 GHz

RAM – 8 GB

Hard Disk – 35 GB

Graphics – 2 GB

Below is a statement from ‘Edwin’, who apparently works at Feral Interactive, on why AMD cards are not supported.

Due to various issues with Mesa we will not be supporting AMD on release. We only can officially support drivers if they meet the quality needed and sadly Mesa isn’t quite there yet as it has a few issues and edge cases. It was a super close call but in the end we won’t say a game is officially supported unless it runs great using release versions of the drivers and kernel.

However we developed the game using AMD cards as well as Nvidia so if you install the latest Mesa beta drivers and other updates as needed it runs it’s just not at a level where we can provide official support as some edge cases will exist. Hopefully once all the Mesa developments get into a stable release version and all the related kernel improvements also are in a stable release then support for Mad Max and similar complex titles will be easier/possible.

Summary: AMD is unsupported meaning we don’t recommend you purchase it on AMD GPUs, however we have done a lot of work towards support which should happen once all the improvements to Mesa/Kernel are in stable branches.

That post can be found here.

As long as Feral brings the goods with a good-quality port of a great game, you cannot go wrong.