
Backup using rsync with non-standard ssh port

*nix systems have rsync to back up or synchronize (mirror) their files to a backup computer. For example, I can back up the files on my home computer to my office computer. rsync does not copy files that have not changed. The syntax is something like

rsync -avzhe ssh /home/mydir/ mylogin@myserver.org:/home/mydir/ 

The -a is archive mode, the same as for cp. -v is verbose, so you can see each file as it is processed and get a sense of how far along the job is. -z compresses the data in transit and -h makes the sizes human-readable. The -e lets you specify the remote shell to use, in my case ssh.

Note: SSH must be working on both systems for rsync to work over SSH. Also note that OpenSSH can be unreasonable and non-obvious about permissions: the target login directory (example: /home/mylogin) must NOT be writable by group or other. Mostly this will not be a problem; chmod 0755 /home/mylogin will do. BUT also note the /home/mylogin/.ssh folder MUST be 0700 and the /home/mylogin/.ssh/authorized_keys file must be 0600. Otherwise SSH simply returns “Permission denied (publickey).” and refuses to connect. Yeah, someone didn’t think that one through all the way.
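
A quick way to put those permissions right, assuming the same example login used above:

chmod 755 /home/mylogin                        # home folder must not be group/other writable
chmod 700 /home/mylogin/.ssh
chmod 600 /home/mylogin/.ssh/authorized_keys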
Note: you can pull or push files around a few at a time, without checking dates and such, by using scp. It is like the copy command, cp, but works through ssh. Format example:

scp mylogin@mydomain:/myfilename .
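
If the remote sshd listens on a non-standard port, scp takes the port number with a capital -P. A minimal example, assuming the port 2222 used in the script further down:

scp -P 2222 mylogin@mydomain:/myfilename .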

I had a couple of problems when I first tried rsync:

  1. It wiped out my ssh credentials (a 2048-bit key in /home/mydir/.ssh/authorized_keys) on the remote system by copying the .ssh/authorized_keys from my local system right over the top of it. It is probably not a good idea to copy your hidden folders up to the remote system.
  2. I use non-standard port numbers for ssh to make hacking me a little more interesting, and I found nothing obvious in the docs about how to tell rsync which port to use.

I solved both problems in the little script below, and it also emails me a report when it is done. Note that the app I used to send the email is “sendemail”, with an “e” in the middle, not “sendmail”. I removed mailutils because I do not run mail servers at this time, and that makes it a bit more interesting to hijack my systems for spamming since there is no app left on the box to send the spam: remove all programs you don’t need to reduce vulnerabilities. The sendemail program can be installed from the repositories.

#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
DEBIAN_FRONTEND=noninteractive
#
NOW=$(date +"%Y%m%d-%H%M%S")
LOGDIR="/home/mydir/log"
LOGME="$LOGDIR/rsync-$NOW.log"

if [ -d "$LOGDIR" ]
then
    echo "Log folder located at $LOGDIR"
else
    echo "Creating log folder at $LOGDIR"
    mkdir -p "$LOGDIR"
fi

echo ===
echo "$HOSTNAME batch job nightly mirror to my backup server $NOW"
echo ===

# -e 'ssh -p2222' uses my non-standard ssh port; --exclude='\.*' keeps hidden
# folders such as .ssh from overwriting the credentials on the remote system.
rsync -avzhe 'ssh -p2222' --progress --exclude='\.*' /home/mydir/ mylogin@myserver.org:/home/mydir/ > "$LOGME" 2>&1

# Email the log to myself (sendemail, not sendmail).
sendemail \
  -f "$HOSTNAME@myserver.org" \
  -t receiver@mydomain.com \
  -u "$HOSTNAME Nightly Mirroring Report" \
  -s my-mail-server.net:port# \
  -xu "sender@mydomain.com" \
  -xp "my-password" \
  -o message-file="$LOGME"
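
To run this unattended, a root crontab entry is enough. A minimal sketch, assuming the script is saved as /home/mydir/bin/nightly-mirror.sh (a made-up path) and made executable:

chmod +x /home/mydir/bin/nightly-mirror.sh
# then add this line with "crontab -e" to run the mirror nightly at 02:15 (time is illustrative):
15 2 * * * /home/mydir/bin/nightly-mirror.sh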



Ramblings of a mad Admin


Mad as in crazy. No one in my family suffers from insanity: mostly we enjoy it. It’s hereditary, you know, parents catch it from their children.

I spent the first six weeks of this year sick in bed, which eliminated my chance to do annual maintenance on my network. Recently I received email informing me that my network was being used as a base for brute-force attacks and port scans. After verifying from my firewall logs that SSH was leaving my network just as the other sysop said, I started tracking down the culprit computer. The long and short of it was that several of the Microsoft systems in my network did have the usual viruses, which I removed. Yes, they had Microsoft’s Security Essentials anti-virus installed, but that is not enough: they get re-infected daily in normal use. My desktop had just been reformatted and upgraded to Mint 14 the month before and it still showed clean. And the CentOS file server that I installed seven years ago, and that had just become Internet-facing, was now dirty. Very dirty.

I removed 79 viruses from the file server and realized that what I probably needed to do was the upgrade that I had intended to do four years ago. Scant resources of time and money continuously frustrate my efforts to be optimal 8). I also found more interesting things on the file server that I will share with you here in case you encounter a similar situation. The most likely way this machine was compromised was that I had been asked to make it available via Internet so that staff could work remotely, I had complied quickly at the end of the day, and in my haste I made some serious mistakes: this machine had been installed several years prior and my assumptions about its configuration were wrong. ASSume makes an ass out of U & Me. Amazingly, there were no SSH log entries on the file server showing the hackers’ use of the system.

I assumed that I had DenyHosts on this machine, but it was not installed. DenyHosts bans an IP (by making an entry in /etc/hosts.deny: yes, I know that’s old skool, but it works nicely) after a specified number of failed attempts to log in. This alone would have been a lethal mistake, since it left the box open to brute-force password attempts all night, and eventually those succeed: we live in a continuous fog of hack attempts, with brute-force runs trying to log in as root with every conceivable password, starting with ‘password’ and ‘123456’ and moving right along until something works. Normally remote login as root is not a problem for my systems because I have already banned ‘root’ as a remote login: you must log in using another account and then shell to root. HOWEVER, making another lethal error, I did not realize that this distribution allowed root login by default. I found and corrected this error, but too late.
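
For reference, the entries DenyHosts writes to /etc/hosts.deny look roughly like this (the address here is made up):

# /etc/hosts.deny -- lines added by DenyHosts after repeated failed logins
sshd: 203.0.113.45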

In retrospect, having a system that runs seven years without doing anything much to it is good compared to for-profit server OSes which require daily busywork and reboots, but most *nix admins really won’t want to associate with me after hearing I’m this much of a hack. So don’t tell them. Seven years ago, when I configured this system, I knew less than I know today.

Looking further into things, I decided to add a nightly task that scans and updates every computer on the premises every night. The computers are programmed in BIOS to turn on with an RTC alarm, and then they boot to Linux. A cron job runs some safe time after that, updates the system, scans for and removes any malware, and turns the machine back off. I decided to add it as a cron job owned by root and used crontab to add it, but along the way I also looked at /etc/cron.d, /etc/cron.daily, and so forth. There was a really interesting script in the /etc/cron.daily directory that collected the ssh access logs, tarred them to a temp file, deleted the SSH logs, and emailed the temp file to a @gmail.com address. It also ran port scans. Ah. Culprit found, and this explained why I could not see anything in the SSH log files. This script was not detected as a hack by clamscan. I deleted the script. Should have saved it, I know.
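
This is not the exact job I deployed, just a sketch of the idea, assuming a Debian-style system with clamscan installed; the schedule, log location, and script name are all illustrative:

# /etc/cron.d/nightly-maintenance -- runs at 03:30, a safe while after the RTC alarm boots the box
30 3 * * * root /usr/local/sbin/nightly-maintenance.sh

#!/bin/bash
# /usr/local/sbin/nightly-maintenance.sh -- update, scan, power off
export DEBIAN_FRONTEND=noninteractive
apt-get update -q
apt-get -y dist-upgrade
clamscan -r --remove --infected --log="/var/local/scanlogs/clam-$(date +%Y%m%d).log" /
shutdown -h now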

What clamscan did find was four root kits. Oddly, I could not delete these four files even as root. I mounted the disks in a new Linux box and examined them (booting from another drive, not the infected drives). These four files could not be deleted in this configuration either, even as root.

Now, I am not used to being told as root that I cannot do something. Usually I am told that I was obeyed, and allowed later to discover that the system did what I asked it to do rather than what I wanted it to do. But here is the bottom line: *nix files do have owners and permissions, which are controlled with chown and chmod. HOWEVER, they also have “attributes”, which can be viewed with lsattr and controlled with chattr. These attributes can make a file “append only”, “immutable”, and more, so that even root cannot delete the file.
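
For example (the file name is made up), this is roughly how such a file behaves and how to clear the attributes:

lsattr /usr/bin/.suspect          # shows something like ----i--------e-- (the 'i' means immutable)
rm /usr/bin/.suspect              # "Operation not permitted", even as root
chattr -ia /usr/bin/.suspect      # clear the immutable and append-only attributes
rm /usr/bin/.suspect              # now it goes away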

Of course by the time I found even the first part of this I realized that there would be more hidden than I would discover, and so I would need to archive the old disk drives in the safe and install a new file server on new drives (seven years is long enough to recommend an upgrade to new drives).

So here is a summary of SOME of the things I have learned to do in preparing a system for an Internet facing job. Most of them are not new to me now, but I share them to help:

1. Use an up-to-date distro, and remove all packages not needed for the machine’s core purpose. Run “apt-get update”, “apt-get dist-upgrade”, and “clamscan -r --remove --infected /” nightly. After being hacked, you must use different account logins and passwords: assume all prior login information has been reverse engineered and published on the Internet. It probably has.

1.a. Automate the daily updates, upgrades, and scans. Put the scan logs into an unusual location to complicate hacking automation. Insert the cron job to automate virus scanning and updates with “crontab -e” and follow the instructions in the file. IF you still have any Microsoft systems on your network, realize that they will have viruses and other malware installed daily in normal use. You must scan all of them nightly and remove the viruses, running from a non-Microsoft partition. The best configuration would be uniform hardware in diskless workstations that network-boot from virtual machines, which can be more easily maintained and protected.

1.b. Set the firewall to allow only those ports absolutely needed for the machine’s specific purpose.

2. Install DenyHosts before the Internet can see the machine. Set it to email you reports. You can then keep tabs on how many hacks you are getting daily. If that drops all at once, you probably need to look into why. apt-get update, apt-get install denyhosts.

3. Be sure the /etc/ssh/sshd_config file has “PermitRootLogin no”. If possible, also change the port to something other than port 22. Port scanners will still find it, but make it interesting for them. I have no idea why the powers that be would ever set PermitRootLogin to yes by default. A minimal sketch of these changes appears after this list.

4. Try to avoid installing samba. All *nix boxes, Mac and Linux, can easily connect with SSH. If all access is through SSH2 then you can focus on SSH security. Samba can introduce vulnerabilities.

5. Build it on a Virtual Machine if you have adequate hardware — you can restore a compromised VM from a good backup by merely copying the files from backup.

6. Segment your network. We break ours (now) into multiple sections according to employee job function. Some of our internal subnets have only a single computer to allow work-from-home access for high level (financial, security, etc) workers without granting any access to these areas for others. Wireless can also be separated out from wired and different access points programmed on physically separate wiring for staff and public accessibility. Only one of our segments has Windows computers — the “public” segment. Windows is hacked daily and our nightly scans (hopefully) are killing all the new bugs each day. Once a machine is infected it can explore your network, but by segmenting your public area physically — not just by using different subnets: make physically separate electrical wiring, firewalls, and servers — you make the task of infecting your critical infrastructure a little more interesting.

7. Don’t assume your workers are too lazy, too stupid, or too foolish to honor sensible security steps. Educate your people on why they need a password that is at least eight characters long. Share why the wireless passwords change every so often. Tell them why there is a staff wireless and a public wireless and why they are not to tell anyone the staff wireless password. Explain why they are not allowed to cruise the net on that special PC in finance. Share why Windows XP is not allowed in the facility and why all notebooks must have approved, up-to-date anti-virus software working on them, yes, even on Macs. If all else fails tell them “That’s just how we roll.” For some reason that works when all else fails.
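
Here is the sketch promised in point 3, covering points 1.b and 3 above. It assumes an Ubuntu/Debian-style box and the non-standard port 2222 used earlier, so adjust the port number and the service commands for your distribution:

# /etc/ssh/sshd_config (excerpt)
Port 2222
PermitRootLogin no

service ssh restart          # reload sshd so the changes take effect
ufw allow 2222/tcp           # open only the ports the machine actually needs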

If you have more ideas to contribute on this topic, please feel free to comment. I read each comment before it is approved for public visibility, so please be patient. Registration is required to comment. I do not otherwise use, sell, share, or divulge registered users’ information, but I certainly understand if you do not want to bother. Please understand that I do this to control what is posted, because experience has shown that it is best to do so.


EdUbuntu 12.04: Configuring to get work done

Last night we installed the new Ubuntu 12.04 Precise on our HP 6910p business-class notebook from a USB memory stick. The installation went about as quickly and painlessly as any we have done. We also installed Bluefish, FileZilla, Chromium, the Compiz configuration manager, and SSH Server, and did some configuring of the Launcher bar.

Today we paid attention to the details with some fine tuning so we could get down to work. All in all it went splendidly, far better than we anticipated; we were probably unduly cynical when the Unity Desktop was introduced last year, due mostly to the absence of choice at that time. There is a lesson in that. With Microsoft preparing the very same scenario for the introduction of Windows 8 later this year, the Microsoft stockholders could be spared some loss by an appropriate approach to the customer: provide a choice to try something new rather than unilaterally dictating that we like it or lump it.

We installed the EdUbuntu edition for two reasons. First, we provide training to a diverse demographic in our lab, with ages from kindergarten up to 105 years old, and EdUbuntu has a lot of nice K-12 learning material in it for students and for teachers. Second, EdUbuntu comes with the Gnome desktop in case we don’t want Unity. As it turns out, we are using Unity now because the compiz desktop Zoom comes closer to working in Unity than in Gnome.

From last night, first we did a “sudo passwd root” to set the root password to something we can control, then we ran

# apt-get install chromium-browser
# apt-get install compizconfig-settings-manager
# apt-get install openssh-server
# apt-get dist-upgrade

We found FileZilla, Bluefish, and VLC Media Player in the “Ubuntu Software Center” (the shopping bag on the Unity Launcher), so we installed them from there. Then we compressed our .thunderbird and .filezilla folders on our desktop in preparation for copying them to our laptop. We had a bit of a runaround looking for the “Connect to Server” menu item, but eventually a tip led us to open the FILE menu (top of screen) for the “Home Folder”, and that allowed us to connect to our desktop with SSH and copy the various files we use. We extracted both the .thunderbird and .filezilla archives and tested Thunderbird and FileZilla: all our mail and web site logins were correct and ready for us to begin work.
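
The copy can also be done from a terminal. A rough sketch of the idea, assuming the desktop is reachable over SSH as mydesktop (a made-up host name):

# on the desktop, from the home folder: pack the two profile folders
tar czf profiles.tar.gz .thunderbird .filezilla
# on the laptop: fetch the archive and unpack it into the home folder
scp mylogin@mydesktop:profiles.tar.gz .
tar xzf profiles.tar.gz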

As you know, the way you choose which desktop you want is by clicking the little dot in the upper right corner of the password box at login time. You pick Unity or Gnome and then enter your password. We started with Gnome, moved the bar to the bottom of the screen, changed its height to 35 px, turned on the Show Hide Buttons, and added the launchers we use to the Gnome task bar; you must hold down the ALT key while you right-click on the task bar to get the pop-up menu, then click Properties to configure the bar, but you can drag and drop icons from the menus to the task bar as usual. We also changed the Workspace Switcher to use two rows and hold eight workspaces.

We also programmed the compiz Enhanced Zoom Desktop bindings:

Zoom in: Button4
Zoom out: Button5
Zoom box: Button2

Sadly, it didn’t work in Gnome. But it did sort of work in Unity: the desktop itself does zoom, but the launcher bar does not. Of course it didn’t before either. So we are currently using Unity. It is nice to have a choice.

Configuring the launcher in Unity is easy enough too, but a bit different from Gnome. To change the position of an icon on the Launcher you grab it and drag it up or down to where you want it. To add a program to the Unity Launcher, you can drag the icon right off the Dash Home (search results) onto the Launcher, or run the program from Dash Home, which causes an icon representing the program to appear on the Launcher at the bottom. Either way, then right-click that icon and click “Lock to Launcher”. To remove an icon from the Launcher you right-click the icon and then click “Unlock from Launcher”. We added Chromium, Bluefish, FileZilla, and Text Editor to our Launcher as we use those daily, and removed Firefox and the Ubuntu Software Center from the Launcher to save space: we can always run them from the Dash Home (round red circle at the top of the launcher) when we need them.

When we opened the Chromium Browser to add it to the Launcher it asked if it could sync with Google to restore our browser settings. We said yes. It worked wonderfully. Nice to not need to retype all our bookmarks or figure out all our web site passwords. It does this for Android in our Nook Color and HP TouchPad also, but the form factors there are different and it was not quite as nice.

The real change for us is that the Launcher stayed put! It didn’t play the hide and seek game on us — jumping around as we opened or closed windows. This greatly enhanced the usability of the Unity desktop because we no longer had to fight with the Launcher to get it out of our way — the prior behavior often had the launcher jumping out from the side to cover up parts of the left side of the window that we were trying to use, especially when we were trying to click on mail folders in Thunderbird.

Configuring the Unity Launcher is also better now: Open the System Settings (bottom of Launcher, a Wrench over a Gear) and choose the first icon, Appearance. The size of the icons on the launcher is now controllable at the bottom of the window: we set ours to 35 px. There are also two tabs at the top of this screen — “Look” and “Behavior”. We never noticed “Behavior” before. But this is where you can turn the AutoHide feature for the Unity Launcher ON or OFF. You can also specify where to point to unhide the launcher and choose how sensitive it will be to your pointing. This simple control makes a huge difference in usability.

The thing we always forget to do until we get sufficiently annoyed is to change the time delay for the Grub boot loader. The default setting of ten seconds is not enough: not in the lab, where our students must read the boot screen to figure it out, and not for us, as we become distracted waiting for the system to reboot. Sixty seconds works much better for us.

To set the GRUB delay, do not edit /boot/grub/grub.cfg directly; the next time your system receives an updated kernel your changes will revert to the default. Instead, go to the /etc/default folder and edit the file named grub, for example by typing “nano grub”. Change the GRUB_TIMEOUT parameter to a number bigger than ten. I used 90 today: I can always press ENTER to boot right away, so we do not need to wait for the timeout, but when we are busy, having a wider time frame in which to specify which system we want booted seems to help. After you edit grub, run update-grub to regenerate the /boot/grub/grub.cfg file.
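
For reference, the whole change amounts to one line in /etc/default/grub plus regenerating the config (90 is the value used above):

sudo nano /etc/default/grub      # change the line GRUB_TIMEOUT=10 to GRUB_TIMEOUT=90
sudo update-grub                 # rewrites /boot/grub/grub.cfg with the new delay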

The next item we adjusted in the System Settings was the Brightness and Lock. We let the screen dim when unused, but not turn off into locking mode — that just annoys us. We also went to the Power Settings and adjusted the When Lid is Closed options so the notebook Does Nothing when the lid is closed. If we start a long job on the notebook, or plug it into a keyboard and monitor, we often close the lid. The default is to put the notebook to sleep when the lid is closed, which is not what works best for us.

The last thing we did there was to install a printer. And that is about a wrap for this posting. We installed EdUbuntu 12.04 on our desktop too, and it went about the same as our install on this notebook. We added the Firewall Configuration tool, VirtualBox, Samba, the Arduino IDE (bonus!!!), and Eagle (PCB design). We did try encryption on our home folder on the desktop, and we will report how that works later.