Severe Vulnerability in All Wi-Fi Devices

This entry was posted in General Security on October 16, 2017 by Mark Maunder.

There is a major Wi-Fi vulnerability (known as KRACK, for "key reinstallation attack") that affects virtually all devices using Wi-Fi. The vulnerability allows attackers to decrypt WPA2-protected connections.

You can read more on the Wordfence blog.

The “Data Center” of the Future

A visual model of the Data Center of the Future: Coffee Maker insides after years of use.

I just read through Network World's "How a data center works, today and tomorrow."

They believe "the future of data centers will rely on cloud, hyperconverged infrastructure and more powerful components."

I partly agree, and partly disagree.

The IT business cycle is well known: IT starts as a centralized department, becomes a bottleneck, other departments set up their own IT for operational survival, the uncoordinated small IT becomes unmanageable, policy swings back to standardizing and centralizing IT, and the cycle repeats. So “Data Centers” will be centralized, then distributed, then centralized again … likely forever at the corporate level.

The controlling force will not be commercial dominance: no one company will succeed at becoming the global "Data Center Hegemon". Grassroots, open-source, widely varied people-driven interests will take over IN SPITE of corporate attempts to "own" the Data Center scene. The "Data Centers" inside large organizations will be a tiny part of the planetary Data Center.

Photo of a Nest Thermostat in The Bond Building. 20 June 2013, 11:54:32 by Amanitamano

I also disagree that the Data Center of the future will be composed mostly of more powerful things. Rather, I feel it will be made up of far less powerful things: redundant, error-correcting, in massive numbers, using cooperative computing protocols to become a massive unified computing power. Just as each cell in a human body is little by itself, coordinated together all the cells form a much more significant and powerful organism: an organism that can lose many cells, survive, heal, and grow. No single corporation, or corporate alliance, can approach this potential because of management, legal, contractual, and financial encumbrances. The Data Center's life blood is network connectivity, and its future body will be shaped accordingly.

All technologies must eventually inter-operate, and those which do not will be relegated to irrelevance, but most of this will come from non-corporate innovation and not-for-profit initiatives. There will likely be government attempts, initiated by corporate influence, to eradicate all "unauthorized" software on some pretense: any software not sold by "authorized" programmers, such as software created by programmers not under corporate control and released for the public good without mandated government "back doors" or profit motives, may even be criminalized. Public software will not only survive, it will grow; the attempts to destroy it will drive it underground, improve it, increase its sophistication, and make it harder, not easier, to oppose.

And yes, I believe some large corporations will contribute to the process, which will eventually be overwhelmed and confiscated by massive grassroots factors: gently, slowly, imperceptibly, until it is too late. The surviving corporations will be the ones which recognize this from the start and design to work together with, rather than oppose, global communications.

There will always be a "free public cloud" (that is, network-based file and app servers), sometimes bootlegged inside 'secure' corporate systems, but there will be much more storage gathered a kilobyte at a time from mundane and ignored things, such as Mom's pacemaker or Uncle Joe's radio, all coordinated by Harriot's thermostat. IoT device security and control will improve accordingly. Remember Bitcoin. In the future, my FitBit may be harboring 1% of your favorite vacation picture for you – but don't worry: if I upgrade my watch, Harry's fish locator and Mary's microwave have redundant copies, just in case.

The idea that people en masse will keep their most private data on a server owned by some for-profit entity that will turn everything over to secret government agencies or marketers at a whim is unworkable until people have absolute confidence that their private data will remain absolutely private NO MATTER WHAT. This can never happen with any ownership of centralized "Cloud" services, because government can and will seize those centralized computers if it thinks it necessary.

Reading, gaming, sleeping… All in Kyiv subway by teteria sonnna from Obukhiv, Ukraine

There will probably be significant human influencers wearing rags and living in dilapidated buildings or on the streets, as well as those wearing jeans or tuxedos and living in middle-class homes or skyscrapers. Botnets will no longer be merely mafia profit engines, ransomware, and spam generators, but will be a means to suborn "secure" private networks or to carry communications kept temporarily private from "official" corporate or government eyes.

The "Data Center" of the future will not be one place but every place. It will be connected by multiple redundant means to circumvent corporate power to use government to silence profit-siphoning opposition. It will not look like a 'Max Headroom' dystopia, but free open-source software will be critical to its reliable operation, even though specific corporate proprietary software will also be present.

And the one thing we can count on is that it will be constantly changing. How can for-profit corporate interests survive or thrive in this new world? Easy: simply make your corporation indispensable to the victors.

Exhaust Ports

Picture of Chrysanthemum

júhuā (Chinese: 菊花) [Chrysanthemum]

There seems to be an upswing of hack attempts from one country in particular – other "3rd world" nations half-heartedly try to hack, but one place in particular dishonors its ancestors with the clumsy, incompetent behaviour of fools – trying to break into the homes of others who have done nothing to harm them. They disfigure their already disgraced face by attacking little public charities which have no money and unselfishly help old people learn skills and get jobs.

There is a nice little way to deal with those "júhuā" through IP rules (blocking via .htaccess merely blocks web browser access, not, for example, FTP or SSH).

If you have cPanel hosting, don't really care whether certain foreign countries have any access to your web sites, and have noticed an upswing in SQL injection or other hack attempts, you might consider using the IP Blocker functionality to deny access from any related IP ranges. The offending range (and several others that come from the same place) can no longer access my sites. Blocking a range of IP addresses can be more effective than blocking a single IP address, because hackers typically switch addresses after being blocked, but they normally must use another address from the same group. I will watch for other fools and block them when they appear as well.
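For hosts without cPanel, the same range-wide blocking can be done at the firewall level, which covers FTP and SSH as well as the web. A minimal sketch, assuming iptables; the ranges shown are RFC 5737 documentation addresses standing in for whatever ranges your own logs implicate:

```shell
# Sketch: block whole CIDR ranges at the firewall, not just in .htaccess.
# 203.0.113.0/24 and 198.51.100.0/24 are documentation-only placeholders.
for range in 203.0.113.0/24 198.51.100.0/24; do
    # print the commands; run them as root (without the echo) to apply
    echo iptables -A INPUT -s "$range" -j DROP
done
```

Dropping the whole /24 means the attacker's next address in the same allocation is already blocked before they try it.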

Conky Weather

Changes in the formats of available weather information recently required me to revise my conky script. Conky can be installed from the Ubuntu / Mint repos or downloaded from GitHub, and documentation is on SourceForge or in the man page. This information is furnished AS IS – use it at your own risk. The most recent version of the GPL applies.

The following code is how I get weather information directly from Weather.Gov and display up-to-date weather for my area. The station code for surface weather observations is found on the Weather.Gov web page for your zip code, and the state / county codes for weather alerts are at the bottom of that page. My station code is KFWA and my county is INC003. Replace these with the ones appropriate for your area.

The recommended pull time for your area is in the XML file that you get from Weather.Gov. Here is a sample of the first lines of that file, showing the suggested pickup time:

<?xml version="1.0" encoding="ISO-8859-1"?>
<?xml-stylesheet href="latest_ob.xsl" type="text/xsl"?>
<current_observation version="1.0"
xmlns:xsi="" xsi:noNamespaceSchemaLocation="">
<credit>NOAA's National Weather Service</credit>
<title>NOAA's National Weather Service</title>
<suggested_pickup>15 minutes after the hour</suggested_pickup>
<location>Fort Wayne International Airport, IN</location>

The part of my .conkyrc script dealing with weather is as follows:

${color FFAA00}WEATHER ${hr 2}$color${font Verdana:size=9}
#${execi 3600 curl -s > wwraw.xml}
#${execi 3600 wget -q --output-document="walerts.xml"}
${color 00FF00}${exec cat wwraw.xml | sed -n '/<location>/p' | cut -d'>' -f2 | cut -d'<' -f1}${color}
${color 00FF00}${exec cat wwraw.xml | sed -n '/<weather>/p' | cut -d'>' -f2 | cut -d'<' -f1}${color}
${color 888888}Wind: ${exec cat wwraw.xml | sed -n '/<wind_string>/p' | cut -d'>' -f2 | cut -d'<' -f1}${color}
${color FFFF00}Humidity: ${exec cat wwraw.xml | sed -n '/<relative_humidity>/p' | cut -d'>' -f2 | cut -d'<' -f1}% ${color}
${color 00FF00}Temperature: ${exec cat wwraw.xml | sed -n '/<temp_f>/p' | cut -d'>' -f2 | cut -d'<' -f1}${color}
${color 888888}Dew Point: ${exec cat wwraw.xml | sed -n '/<dewpoint_f>/p' | cut -d'>' -f2 | cut -d'<' -f1}${color}
${color 888888}Pressure: ${exec cat wwraw.xml | sed -n '/<pressure_in>/p' | cut -d'>' -f2 | cut -d'<' -f1} in Hg${color}${font}
${font Verdana:style=bold:size=10}Warnings:${font}${color}
${exec cat walerts.xml | sed -n '/<title>/p' | cut -d'>' -f2 | cut -d'<' -f1}

Notice the two lines which I have commented out with the pound (#) sign. These fetch the surface weather observations and weather alerts from Weather.Gov. They are commented out in my conky script because I fetch the files instead using cron, 15 minutes after each hour, the time recommended by Weather.Gov. They don't HAVE to be in the script, but I leave them there in case I need to refer back to them at some future date, such as if I accidentally delete my weather-fetching bash script. Edit your crontab in a terminal by typing "crontab -e".

# m h dom mon dow command
15 * * * * ./

The .xml file provided by Weather.Gov says what time is recommended for pulls. I try to avoid pinging more frequently than needed as a matter of respect: the weather is updated once per hour, so more frequent pulls achieve nothing useful and cost bandwidth.

To pull the weather information I use a bash script, as follows. Remember that a bash script must be "executable" for your login; search on "Linux permissions" if you need more information. I let cron call this once per hour, and it is also called from my .bash_profile when I log in.

#!/bin/bash
#pull weather from Weather.Gov

wget -q --output-document="wwraw.xml"
wget -q --output-document="walerts.xml"
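A sketch of wiring this up, assuming the script is saved as getweather.sh (the original post does not name the file):

```shell
# Save the pull script under an assumed name and make it executable
# for your login. The heredoc body mirrors the script above; the
# Weather.Gov URLs still need to be filled in before wget will work.
cat > getweather.sh <<'EOF'
#!/bin/bash
# pull weather from Weather.Gov (insert the observation and alert URLs)
wget -q --output-document="wwraw.xml"
wget -q --output-document="walerts.xml"
EOF
chmod u+x getweather.sh
```

Once executable, `./getweather.sh` can be called from cron or from .bash_profile as described above.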

To prevent excessive downloads, I use this complete bash script:

#!/bin/bash
#pull weather from Weather.Gov
#ONLY IF it has not been pulled recently
#weather is updated once per hour
# recommended time to pull is hour+15
# manually noted weather is updated about
# 54 minutes after the hour
#last_update: 20170826 JDN

# define minimum time lapse (minutes) before a new pull is permitted
declare -i MINAGE=22
echo 'Min age' $MINAGE

# calculate time lapsed since last pull
if [ -f wwraw.xml ]; then
    declare -i LAPSE=$(( ( $(date +%s) - $(stat wwraw.xml -c %Y) ) / 60 ))
else
    # no previous pull: treat the cache as very old
    declare -i LAPSE=999
fi
echo 'Lapse' $LAPSE

# allow override: any command line parameter (e.g. --force) forces a pull
if [ -n "$1" ]; then declare -i FORCE=1; else declare -i FORCE=0; fi
echo force is $FORCE

# calculate if pull is permitted (boolean)
declare -i DOPULL=$(( LAPSE >= MINAGE ? 1 : 0 ))
echo dopull is $DOPULL

# test time lapse vs. minimum time lapse to allow pull or override
if [[ $(( DOPULL + FORCE )) -gt 0 ]]; then
    # debugging message output if pull was performed
    echo "File was pulled $LAPSE minutes ago. Pulling new weather data"
    # get surface observations; file will have date / time from Weather.Gov
    wget -q --output-document="wwraw.xml"
    # mark file with current time to prevent hammering
    touch wwraw.xml
    # get weather alerts
    wget -q --output-document="walerts.xml"
else
    # debugging message output if pull was not performed
    echo "File was pulled $LAPSE minutes ago. Ignoring request to pull again."
fi

unset FORCE
unset DOPULL
unset MINAGE
unset LAPSE
exit 0
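The age gate at the heart of the script can be exercised in isolation. A minimal sketch: a freshly touched cache file has an age of zero minutes, so DOPULL comes out 0 and the pull is skipped.

```shell
# Demonstrate the age gate: DOPULL is 1 only when the cached file is
# at least MINAGE minutes old (or missing entirely).
declare -i MINAGE=22
touch wwraw.xml                      # freshly written cache file, age 0
declare -i LAPSE=$(( ( $(date +%s) - $(stat wwraw.xml -c %Y) ) / 60 ))
declare -i DOPULL=$(( LAPSE >= MINAGE ? 1 : 0 ))
echo $DOPULL                         # 0: too fresh, skip the pull
```

Passing any argument sets FORCE in the full script, which overrides this gate.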

How to configure Windows 10 to allow sign ons using credentials from Office 365

In my lab I have several computers which are used by many different community members, from job trainees in formal training at my agency to students and business persons just stopping by for a cup o' joe and to check their email. It is impractical to give every person their own account on every computer. However, I wanted each student to have their own file space, and community members to be able to work on their resumes and such, without a need for special treatment.

Now, thanks to changes Microsoft provided in Windows 10, people can sign in with their Office 365 credentials. Read more…