Archive for the ‘Linux’ Category

openmediavault further steps…

Some time passed since I installed openmediavault on my new small server.

I expected it to be simple to work with, but the web interface seemed to freeze on several configuration commands. At first I thought it was just a bad UI, but a quick search on the forum led me to a worse scenario: one of the drives in my server was actually failing.

As I mentioned in my previous post, something weird had probably happened during transport, since I found an unpinned heatsink. But maybe the disk had its own faults too (I hope it picked those up during the transport as well).

So what now? I’m planning to replace the faulty disk with a spare one as soon as possible, to see if all the problems go away and I can finally test openmediavault properly.

Let’s see…


First experiments with OpenMediaVault

In August I wrote about my new gadget that I bought on eBay.

It’s basically a 1U micro server with two non-hot-swappable 1TB disks, 8GB of RAM and a quad-core 64-bit AMD Opteron CPU.

Well, it arrived and I started doing some experiments on it. At first I wanted to install NAS4Free, a FreeNAS-derived distro focused on storage management. My intent was to install it on a USB key, leaving the two disks available for data storage. But it wasn’t so easy to install: the installer refused to go beyond network card initialization, so I thought that my new used server was faulty. I then booted Knoppix (the DVD version written onto a different USB key) to take a look at the system and test the RAM with memtest86. Everything seemed fine, so I tried the installer again, but still with no luck.

Then I tried the original FreeNAS distro, assuming it was better developed (it’s now a commercial product), but even that didn’t work. Googling around, I found some reports of problems between AMD Opteron CPUs and FreeBSD, the OS both NAS distros are built on. As I’m not as familiar with the *BSD environment as I am with Linux, I took a totally different direction.

On my favorite YouTube channel, MyPlayhouse, I came across the Xpenology distribution. It’s basically a way to install Synology’s proprietary operating system on standard PC hardware. It’s an experimental setup that gives you the opportunity to try a really well-built, Linux-based NAS distro. I won’t go deep into that installation here, as there is a lot of documentation on the web about it; I’ll just say that it worked pretty well for some time, so I was convinced that my hardware was not faulty… well, not entirely.
In fact, when I was building a RAID 1 array for the first time, one of the disks was blocked from being used; the reason? SMART reported that its reallocated sector count was too high. Ouch! I tried to test it, but the system froze several times, so I gave up, turned everything off, and bought some used 1TB replacement drives. When I was about to replace the defective drive, I decided to give it one more try, and all the errors had disappeared… Why? Maybe some cables were loosened during transport, and by touching all the stuff inside I put it back to work.
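
For the record, the SMART attributes can also be checked from a Linux live system (like the Knoppix key I already had) with smartmontools; something like this, where the device name is just an example:

smartctl -a /dev/sda | grep -i reallocated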

While I was digging around inside, I noticed a passive cooler was attached to the mainboard by only one plastic pin; the other one, together with its spring, had disappeared… Maybe some of the reported errors were caused by overheating of the controller chip. I replaced the missing pin, after first putting some silicone grease under the heatsink, and tried again to install something…

My last choice was OpenMediaVault, another free, Linux-based distro. It can be installed from a USB key onto another USB key, and it’s pretty stable, full of features and widely used.

Its installation process was pretty straightforward, but as I didn’t have any more time to spend in front of the monitor, I launched it and went to work. When I got back home, the monitor was off, no signal, and the server was powered down. I tried again the following day, but it looked even worse: after some time the monitor started flickering, with the VGA signal dropping out randomly, even in the BIOS screen, so I thought my server’s time had come…

I tried one last time to install OMV from scratch, this time waiting in front of the screen… and this time it all worked!

Now that I have my OMV system installed and running, I just have to partition the disks and start using it.

Windows Server 2008 R2 backup (wbadmin)

It’s been a long time since I last wrote… Many things have happened and many more are coming…

This is just a small note to remember a couple of interesting links about troubles you may have with the Windows Server 2008 R2 backup utility (wbadmin). This blog article is a good starting point for getting an idea of the problem.
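
For reference, a typical wbadmin invocation for this kind of backup to a network share looks more or less like this (the share name and the included path are just examples):

wbadmin start backup -backupTarget:\\mynas\backups -include:C:\Data -quiet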

If you try to back up your data to a network location hosted on a Linux-based system (though I think this problem can show up elsewhere too), wbadmin could fail with the following error:

The requested operation could not be completed
due to a file system limitation.

This happens only if you back up just some directories and not the entire disk(s). Why? It’s related to how Linux handles the sparse files required for such a backup. This link explains the problem in detail and a way to solve it, supposing you can modify the Samba configuration on the target machine. If you cannot modify the configuration, you have to back up the entire disk instead of single directories.
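
If I remember correctly, the workaround revolves around Samba’s strict allocate option, so the share definition in smb.conf on the Linux box would look something like this (share name and path are invented):

[backup]
    path = /srv/backup
    read only = no
    strict allocate = yes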

More information on sparse files can be found in the documentation of the fsutil command.
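
For example, to check whether a given file has been created as sparse you can run something like this (the path is just an example):

fsutil sparse queryflag D:\backups\test.vhd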

rsync to non standard port

It’s been a long time since I last wrote on my blog… so here’s just a simple post to help me remember a non-standard rsync command syntax.

rsync is a powerful synchronization tool that can perform a fast and secure backup of files and folders to a remote server. Usually rsync uses ssh to connect to the remote server, so transfers are secure. But rsync relies on the standard ssh configuration to connect to that server. What happens if you have changed, for instance, the ssh daemon’s listening port?

rsync has a command line option (-e) to specify which remote shell to use to connect to the server, and through it you can pass the non-standard port. Say you want to transfer all *.tar.gz files from the local directory to the remote server myremoteserver, connecting over ssh to port 1234 as user user. The command line to use is the following:

rsync --progress -vrae 'ssh -p 1234' *.tar.gz user@myremoteserver:/path/to/remote/directory

The only pitfall of this command is that you’ll be asked for the connection password, so you cannot schedule it to run automatically. But I know there is a way to log on automatically, by providing the local system with a key to access the remote system. I’ll write more on this in the coming days.
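
As a quick reminder to myself, the usual approach is key-based authentication; roughly something like this, assuming the standard OpenSSH tools and the same non-standard port as above:

ssh-keygen -t rsa
ssh-copy-id -p 1234 user@myremoteserver

After this, ssh (and therefore rsync) should connect without asking for a password.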