Tuesday, May 16, 2017

The most basic protections

If you haven’t done so since details about the WannaCry ransomware attack started dominating the news cycle, go right now and verify that all your computers have their current software updates. That’s not just the computer you’re sitting at, but the rest of your family’s computers, your office mates’, and especially the 10-year-old computer in the spare room that you use to download pictures off the old video camera.

Start with any updates for your operating system. Microsoft sends updates on the second Tuesday of every month, with an occasional special update in between. These automatic updates frequently require an irritating computer reboot that comes just as you’re completing a critical project. Search for “Windows update” from the Windows search bar in or near the Start button to verify you’re up-to-date. Do not use a web search, as the results may include ads that lead you to malicious sites. Always install all important updates and any Microsoft Office, Defender, or Security Essentials updates that apply to you (you don’t need to install language packs or other unusual accessories).

Now check that your other software is up-to-date, starting with your web browsers and document viewers. Many programs include a “check for updates” link under the Help menu. Unfortunately, few notify you or install updates automatically. Some may even want to charge for an update or new version.

Other details

If you leave your computer running all the time, the Windows and antimalware updates will usually be installed automatically, including any required reboot. Even so, verify the installation monthly.

Although they may not be susceptible to this attack, don’t forget about the computers in your purse or pocket. Apple is pretty reliable at getting the latest software to i-devices as soon as it’s available. Android users aren’t as lucky, since updates are mediated through Google, the device manufacturer, and then the carrier before they get to you. Apps may get updated frequently or never and can have less-than-desirable actions even when functioning as intended.

Many devices that users don’t think about as “computers” also need frequent updates. If you have a computer professional, they should be aware of the risks posed by equipment such as routers and WiFi access points. At home you may find that equipment such as DVRs, streaming media players, security systems, and personal assistants also poses a risk to your personal information or to the internet at large.

Thursday, April 13, 2017

Protecting your data in transit

Data In Transit – Data At Rest

I recently received this question from a user:

Especially given the new anti-privacy laws. Is there a way to encrypt your data to avoid it getting sold to the highest bidder. I already have everything on Google drive, for the most part. It makes it easy since I have so many computers where I do my work and I travel a lot, which increases the likelihood that I lose a laptop or tablet.

Someone mentioned a VPN. I have one for work. Is it worth getting a VPN for personal use to guard my privacy?

Here's my response:

First of all, congratulations on being aware of these issues.

Second question first:
Protecting your data in transit.

The world as of 1/1/17:

When you interact with websites over HTTPS (financial, shopping, legal, and more every day), your communications are encrypted both ways between your browser and the remote servers. The encryption is good (and evolves as attacks grow more capable), such that anyone tapping the communication can’t read your credit card number. This is why some industries, such as health care and legal, are required by their professional ethics rules to use email only to alert you to go to your account on a secure portal to read any substantive communication.

The risk is if an untrusted party controls a segment of the communication pathway between you and your destination. This “Man In The Middle” can then feed you a bogus certificate that encrypts your data with a key he holds, so he can read it as it goes by. The most common scenario for the MITM is to offer public WiFi in a situation where you would expect it. He could create his own hotspot named “coffeeshop” or “hotel” from the next table or a nearby room and induce you to use it rather than the authentic hotspot.

The world today:

Recent rumblings in Washington imply that any US internet service provider (ISP) will be allowed to act as an MITM. Previously they have at least been on their honor to read and record only the information required to pass your communication on its way toward its destination. Now they may track the contents of your communication and sell what they learn about you to whatever market is interested. This can be particularly valuable, or noxious, depending on your viewpoint, because they already have a lot of personal information about you, such as your name, address, telephone, and creditworthiness, and can attach that to your browsing details.

Even worse, they could attach to their terms of service that you must install their master certificate to your system so they can even look into your HTTPS communications. Presumably, you could opt out of this tracking for an additional cost.

This is where the VPN comes into play. When you install a VPN client on your computer, you receive its certificate through a reliable channel at installation time. By contrast, when you browse to an HTTPS site you receive a certificate on the fly and would have to examine it in detail every time to ensure its validity. Updated browsers will alert you if there seems to be a problem with the cert, but few people understand what the problem might be or how to validate it, so they just accept it anyway.
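The difference between blindly accepting a certificate and actually verifying it can be seen in code. As a sketch (Python’s standard ssl module, no network connection involved), these are the verification settings an updated browser, or any cautious client, applies before trusting a cert:

```python
import ssl

def cautious_context():
    """A TLS context with the checks an up-to-date client applies by default."""
    ctx = ssl.create_default_context()   # loads the system's trusted root certificates
    ctx.check_hostname = True            # the certificate must match the site's name
    ctx.verify_mode = ssl.CERT_REQUIRED  # an unverifiable (bogus) cert aborts the handshake
    return ctx

ctx = cautious_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED, ctx.check_hostname)  # True True
```

With these settings, a MITM’s self-signed or mismatched certificate fails the handshake outright instead of presenting the user a warning to click through.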

Having made a verified connection to the VPN, you then send your data directly through an encrypted link to the VPN’s connection to the internet whence it continues to its destination. This method is comparable to handing a letter to the agent in the post office rather than clipping it to your door and hoping that the person who picks it up is a trusted mail carrier. (When you use a VPN to your office, the endpoint is the office network and you are able to function as though you were sitting at your desk in the office.)

The Opera browser includes the ability to connect directly to a VPN for all your browsing. (Enable it from the Settings menu in the Privacy & security section. You then turn it on or off and choose the location of the exit point from a button in the address bar.) This VPN only protects the data that goes through the Opera browser. If you use another browser, an email client, or another app such as messaging, file sharing, or media streaming, you are not protected.

To protect all your internet traffic you need a VPN that is installed in the operating system like any other program. You may set it to start at your computer’s boot up or turn it on whenever you are away from a trusted internet connection. If you have a company VPN, you can probably access the internet through it and not need another installed VPN. (Be aware, though, that the company VPN, especially from a company computer, makes the company a trusted MITM if you use it for personal communications. Even if they don’t decrypt all of your traffic (many companies do, to protect their computers and network from malware), they still see your metadata, such as the fact that a large file was transmitted to their competitor.)

Using a VPN may degrade your communication speed or increase latency. This would be most noticeable when transferring large files or with real-time applications such as gaming, voice or video chat, or remote computing. Such issues should be less significant with a paid service. The only installed VPN I’m familiar with, which came highly recommended, is proXPN at https://proxpn.com.

Aren’t you glad I answered the easy question first?

Next comes …
Protecting your data at rest.

Monday, October 24, 2016

Protecting your data at rest

Data In Transit – Data At Rest

I recently received this question from a user:

Especially given the new anti-privacy laws. Is there a way to encrypt your data to avoid it getting sold to the highest bidder. I already have everything on Google drive, for the most part. It makes it easy since I have so many computers where I do my work and I travel a lot, which increases the likelihood that I lose a laptop or tablet.

Here's my response:

First of all, congratulations on being aware of these issues.

Protecting data at rest is not a matter of one or two simple responses: 

On your computer you may have financial and medical records, password lists, personal emails, and a decade of browsing history. While legitimate internet communication shouldn’t expose static data, your disc drive is a prime target of malware. You have installed “set and forget” technical protection in the form of antimalware software and think you’re protected. But modern operating systems are already largely hardened, so user best practices matter even more. Once you click on a link, you’ve given whatever is attached to it permission to do whatever it might. Everyone who sits at the computer must develop the reflex to ask why they are opening an attachment or visiting a website, and what the risks are.

You might trust that your data are safe once you turn off the computer and lock the door to your office. But that computer is a laptop sitting on the seat next to you on the train or in the coffee shop. Maybe your data aren’t even on the computer but conveniently shared and available “in the cloud.” Either way, some stranger may be able to walk by and pick it up from you.

How do you protect this?

The answer is that your files should be encrypted whenever they are not in use. Unlike your HTTPS communications, this encryption is something that you must take responsibility for. It’s a nuisance, but it means every time you open a project or share a document you must use a password and appropriate procedures.

Fortunately this need not require entering a unique password constantly. Probably most of the files you handle daily don’t really need to be strongly protected against snooping. Most pictures and emails, even if they’re not public, may not represent a significant privacy or financial risk.

For what does need to be protected, files can be encrypted either individually or in bulk. Modern office suites offer an option to password protect a document as you save it. Compression utilities (“zip”) also can encrypt the files as they’re stored. Their encryption methods are now solid, unlike the password option in Microsoft Office 2003 (.doc files rather than the current .docx format), which could be bypassed without difficulty simply by opening the file in another brand of editor.
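Under the hood, a modern password-protected document derives its encryption key by deliberately slow, repeated hashing, so each guess at the password is expensive. A minimal sketch with Python’s standard library (the salt and iteration count here are illustrative, not what any particular office suite uses):

```python
import hashlib

def derive_key(password, salt, iterations=200_000):
    """Stretch a password into a 32-byte encryption key; more iterations = slower guessing."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

# Real software generates a random salt per file and stores it with the document.
salt = b"illustrative-fixed-salt"
key = derive_key("correct horse battery staple", salt)
print(len(key))  # 32
```

The same password and salt always produce the same key, which is how the document can be reopened later; a different password produces a completely different key and the decryption simply fails.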

For larger quantities of files you can use an encryption system like VeraCrypt to create an encrypted virtual disc or even to encrypt your entire computer. If you choose the virtual disc option, it creates a single file that, when opened, appears to the system like any other drive. When it’s closed, the contents appear as total gibberish to anyone without the key. The encrypted file can be stored or transmitted without fear of losing your data. While it can be stored in a shared cloud, it must be synchronized manually, as most systems won’t recognize when it has been changed.
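That “total gibberish” claim is measurable: well-encrypted data is statistically indistinguishable from random noise. A toy illustration (not VeraCrypt itself) that scores data by Shannon entropy, where 8.0 bits per byte means every byte value is equally likely, i.e. random-looking:

```python
import math
from collections import Counter

def entropy_bits_per_byte(data):
    """Shannon entropy of a byte string; 8.0 = looks perfectly random."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

text = b"Plaintext documents repeat the same byte values constantly." * 50
uniform = bytes(range(256)) * 50   # stand-in for well-encrypted output
print(round(entropy_bits_per_byte(text), 1))  # well under 8: clearly structured
print(entropy_bits_per_byte(uniform))         # 8.0
```

Ordinary text scores only a few bits per byte because letters repeat; a good encrypted container scores at (or indistinguishably near) the maximum, which is why its contents reveal nothing without the key.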

But you want universal access to your data in the cloud.
Again, weigh the nuisance factor of file or folder encryption against the value of its contents. Most “name-brand” cloud providers probably offer reasonable security by requiring a sign-in to your account. Hopefully they also encrypt your data while it’s in transit. The bigger risk is when you give a collaborator access to modify a document that is synchronized back to your computer. In that case, you have given someone permission to put any file they want on your computer without your intervention. This could represent the ultimate phishing attack if you’re not alert to it.

What if someone doesn't have to break in to see your data?

If you synchronize individual files, the cloud provider has your data and all the meta details associated with it. Unless you’ve encrypted the individual files with a password, they also have access to that content. Maybe their terms of service promise they won’t actually read the files, but how will they react if someone comes in claiming to be “with the government” and asks for your data? If their data center is in the same jurisdiction as you, they have to satisfy a subpoena and may even respond to an unjustified request.

You can make your cloud storage secure from this loss by using the same practices you use for data on your own laptop. You would have to download and upload the files every time you use them to ensure the protection is always in force. Collaboration also would be problematic unless you were all working with the shared files in a homogeneous environment such as Microsoft Office365.

Hacked over Russian hackers?


Are you upset that Russian hackers – possibly operating under the influence of, or even directed by, their government – got into the Democratic Party’s email system?

I’m not.

I’m upset that anyone was able to get into the system as easily as they did.

Any high interest operation such as a major election is going to attract the attention of hackers trying to break in for any of a multitude of reasons. Just as Willie Sutton is going to rob banks, political adversaries or those seeking financial gain will take any advantage they can against their opponents.

It is the responsibility of the people with valuable information to protect it themselves. Once an organization reaches a certain size, level of notoriety or importance, or economic or political significance, it must take advantage of professional security experience. An individual who gets hacked may have some losses but won’t necessarily suffer serious economic or reputational disaster. A large business may be able to expend the resources to clean up after it has learned its lessons. But the entities in the middle, from a 10-person office to a national volunteer organization, could be damaged beyond recovery.

What should a high profile organization like a political party do?

If I were consulting for them, the first thing I’d do is sequester the devices and accounts of everyone with a recognizable name. Then I would issue them devices known to be free of malware and stripped of the most-attacked apps. These would route all online activity through the office via VPN, where it is protected from interception and filtered. Similarly, their email and messaging would go through a single system with advanced safeguards and appropriate passwords. Finally, social networking would all be posted by public relations personnel. Although there can be accounts in the principals’ names and the principals may submit posts, everything would be vetted and edited if necessary.

Everyone would also attend a class in protecting themselves against attacks, from phishing to ransomware and all the other online lures. This is because a slip of the finger by anyone from the top dog to the intern – and even the IT staff – can open the entire organization to an attack.

Browsers churn disc drives

A researcher discovered that browsers might churn disc drives - to the extent of writing gigabytes of redundant data per day.

Steve Gibson, using Sysinternals tools, discovered that the Firefox web browser was rewriting a snapshot of its current contents to the default disc every 15 seconds. If you habitually leave your browser with many tabs open, this can amount to a huge amount of data over the course of a day - and since the tabs haven't changed, it's writing the same data every time. (Gibson admits to keeping hundreds of tabs open.)

While writing unnecessary redundant data to the disc may have had a minor impact on overall computer performance a decade ago, today it could seriously degrade the life of modern Solid State Drives (SSDs).

All chip-based memory devices, from a $5 flash drive to the industrial-grade system storage in servers, can have information written to a given cell only a finite number of times before reliability starts to deteriorate. Under normal use, the SSD that helps your laptop run cooler and gives it longer battery life will probably outlive your desire for a faster computer or larger screen. But there is no need to put this extraordinary stress on the system and reduce its life by possibly as much as half.
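Back-of-the-envelope arithmetic shows how redundant writes eat into that finite budget. The figures below are illustrative assumptions, not any particular drive's specification:

```python
def drive_life_years(endurance_tb, writes_gb_per_day):
    """Years until a drive's rated total write endurance is consumed."""
    return endurance_tb * 1000 / writes_gb_per_day / 365

# Hypothetical drive rated for 150 TB of total writes
normal = drive_life_years(150, 20)        # ~20 GB/day of ordinary use
churned = drive_life_years(150, 20 + 20)  # plus ~20 GB/day of redundant browser snapshots
print(round(normal, 1), round(churned, 1))  # 20.5 10.3
```

With the browser's redundant writes roughly matching ordinary use, the drive's rated life is cut in half - from about 20 years to about 10 under these assumed numbers.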

SSDs are also appearing in higher-end consumer and business desktop computers or are being retrofitted by hobbyists. Consumer devices marketed at a lower price point may be even more prone to early failure under this load: they might have less redundancy and not survive as many write cycles as those sold for use in internet servers.

A similar issue of heavy disc usage also exists in Google's Chrome browser. Hopefully publicity will encourage the browser publishers to revise this behavior. Unfortunately, since it is not a security issue, it probably will not get a high priority for correction.

Gibson has determined a tweak to Firefox that allows the user to reduce the churn; it is excerpted at http://bloghd.zaitech.com/extras/BrowsersChurnDisc.pdf. Or listen to the podcast at https://twit.tv/shows/security-now/episodes/582 (you can jump forward to about 1:05).

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2016- Bill Barnes - Disclaimer - Home Page - Blogs Home

Friday, September 23, 2016

How to steal an election

Please read my article on how difficult it actually is to significantly change the outcome of a major election.

Download it here: http://zaitech.com/downloads/HowToStealAnElection_pub-wm.pdf

Wednesday, September 7, 2016


A spinning hard drive (HDD) is often the greatest source of heat in your computer. My custom-built computer has five (5!) HDDs in the case. While one is a different model, they are all 1 TB drives with similar specs.

I happened to be running with the case open recently and touched one of the drives. It was HOT! After installing CrystalDiskInfo (http://crystalmark.info/download/index-e.html), I discovered a couple of my HDDs had internal temperatures of 47°C and 59°C! (That’s 116°F and 138°F.)

I moved one HDD to the empty DVD bay so that none would be sandwiched between two others. Then, with the case open, both drives showed the same internal temperature of 44°C (111°F), whether adjacent to another drive or completely in the open.

When I put the covers on the case, the temperatures came down another 6°, to 38°C (100°F). You may think having the case wide open to the air-conditioned room would be good for component temperatures, but being enclosed allows the fans to pull outside air over the drives and other critical components, cooling them more efficiently.
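For reference, here are the Celsius-to-Fahrenheit conversions quoted in this post (the article text rounds to whole degrees):

```python
def c_to_f(celsius):
    """Convert Celsius (what drive-monitoring tools report) to Fahrenheit."""
    return celsius * 9 / 5 + 32

for c in (47, 59, 44, 38):
    print(f"{c}C = {c_to_f(c):.1f}F")  # 116.6, 138.2, 111.2, 100.4
```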

While I was at it, I pulled out my wife’s computer, which is almost 10 years old – and runs fine. However, when I opened the case, the cavity and heat sink fins held an incredible amount of dust. I hit it with the compressor (I can’t afford enough canned air to keep my computers clean) and reconnected the computer after straightening out the spaghetti bowl of cables that had built up under her desk.

Monday, September 5, 2016

A useful utility

How many keyboards and screens do you have on your desk?

Here's a utility (skip down) to help tame a tangle, but first, the history.

Many hobbyists, power users, and business people find it necessary to work on more than one computer at a time. Lots of people have multiple monitors, but this applies if you have a complete additional computer and monitor at your workstation.

I have long used a KVM (keyboard-video-mouse switch) to use two computers with a single set of desktop components. In the mid-1990s the keyboard would not reliably switch so I kept a second keyboard connected. Unfortunately, I often forgot to move to the alternate keyboard and would type a command to "computer A" that actually had a deleterious effect on "computer B".

I now have 3 monitors on my desk. My primary computer has dual screens and the third is connected to a secondary computer so I can continue to work while monitoring a process - or watching Netflix.

Start reading again ...

I used to use a KVM to control the secondary computer - ignoring the video component. Then I discovered a free utility from Microsoft Garage. This is a group that thinks up neat stuff and makes it work - at least sorta. Then the powers that be decide it's not commercial or of broad interest and abandon the project, but they make the program available - without any promises of support, updates, or even that it will function as described.

I'm using Microsoft's Mouse without Borders* to control my secondary computer. It allows the mouse and keyboard to move seamlessly across up to 4 computers, each with its own monitor. Move your mouse and instantly you're controlling a different computer. Slide back and you're on the original. Even the clipboard comes across more smoothly than it does in many remote control programs.

One of its quirks is that it doesn't reliably reconnect after a reboot. You still might need a KVM or extra keyboard for the twice a month that you have to reboot your computers.

Full links are offered so you can examine the URL to ensure there is no hidden misdirection.

Mouse without Borders: https://www.microsoft.com/en-us/download/details.aspx?id=35460 


Friday, July 15, 2016

Planning for 2020

Windows 7? ... Windows 8.1? ... Windows 10?
Planning for 2020

Note: these comments may be irrelevant after July 29, 2016.

Are you like me? I'm very happy with Windows 7, which I've been using for 6-8 years, and my computer is tweaked just like I like it. This custom-built computer has adequate power for now and is easily upgradable. But Microsoft is definitely going to kill Win7 in four years, while I hope this computer will still be going strong. At that point I'll have to upgrade to the newest version of Windows, for which Microsoft may be charging $249 by then.

By July 29 - I should have started sooner - I will upgrade "this" computer to Windows 10 for free. Then I'll revert and go back to using Win7 until it can't walk any more. However, any time in the future I'll have a free Win10 license ready to run.

There are two ways I could do this: "upgrade on a new installation" or "upgrade, archive, and revert." I'll use the first method, "upgrade." If you have an OEM Windows without install or restore media, you may have to use the second, more complex method.

METHOD 1 - A clean install

My plan is to install Win7 on a new hard drive in this box and allow it to get upgraded. Since I'm no fan of dual boot - and am not sure I could dual-boot the same product key - I'll disconnect my current C: drive and repeat the basic process I performed 2 years ago. Once Win10 is installed, I'll take the new drive out and return to my running machine. Occasionally I'll swap back to Win10 to get updates and verify the installation.

Since this is a generic computer and I have a retail copy of Win7 on DVD, it shouldn't be significantly different from what would happen if I had a drive failure. At this writing, I have installed Win7 on a new drive but am missing a few drivers. I'm looking into a utility to extract the drivers from the running installation, which happens to be on the same hardware. There's also the issue that a reinstallation of Win7 will require over 200 updates and can take a week to complete. There is a means to shortcut that problem by manually installing just a few key updates.

METHOD 2 - Upgrade and revert

If you don't have your original distribution media or find it difficult to temporarily replace your primary boot drive, you will need to upgrade the way Microsoft expects most people to. This will require multiple backups, one or more large capacity external drives, and a lot of interactive patience.

Start with a complete data backup to reliable media. Don't forget any settings and customizations you've made to your applications, and your password database. Also back up your email and account details and passwords if they're not included in your data folders. This protects your data in case something goes terribly wrong.

Then make a full system image of your Win7 boot drive. There are multiple programs that can do this; you will need to pay for most of the ones with comprehensible interfaces. The image allows you to get back to where you started if the upgrade and revert processes fail.

Now allow the Win10 upgrade to install and use it for a while so it has a chance to stabilize. After you're comfortable that everything is working and no data or applications have been lost or corrupted, create an image of Windows 10.

Within 30 days of the upgrade you can revert back to your previous operating system. Theoretically you have a perpetual license to reinstall Win10 on this computer at any time in the future - even if you've made minor changes like adding memory or replacing a hard drive. I don't know how either process works or will work. If anything fails, you've still got your image backups to get back to where you started.


Sunday, June 5, 2016

A second thought on upgrading to Windows 10


If you seriously want to get Windows 10 for free on your computer, you might want to get started by mid-July, 2016. When I went to upgrade my newest brand-name laptop from its factory-installed Win 8, I had to fight with it for several weeks. Here are things to consider:
  • If you are happily running Windows 7 or 8.1, consider keeping it. Microsoft will continue to support them for another 3-1/2 years, and you won't have to worry about missing drivers or other quirks.
  • Will your computer take the upgrade smoothly? In my experience, what Microsoft considers "adequate" hardware has always been very optimistic. It was very happy to install Win10 on my netbook with 1 GB RAM and a 1 GHz Atom CPU. I am telling my clients they need a minimum of 4 GB RAM and a 64-bit multi-core CPU. (2)
  • Is your computer at all old or non-standard? Even if the hardware is capable, your manufacturer may not provide 64-bit or Win10-compatible drivers for components more than 2 or 3 years old. The same goes double for any non-factory components you've added or peripherals like printers or scanners.
  • Perform a full-system image backup to facilitate a roll-back should you have any problems. Even better, clone your hard drive to a new one and upgrade the disc that hasn't already got several years' usage on it. Then your old drive is your backup.
  • Get the resources from Microsoft to install Win10 from a DVD or USB, even if you intend to allow the automatic upgrade. (3)
  • Verify you can boot from your external media. I found the Secure Boot feature of new computers would not allow me to do so. These two steps alone took me a week to complete.
  • Back up your data again. (4)
  • Finally, say "OK" to the nag you've been getting for months. I recommend you choose the "download now, install later" option to ensure a clean, continuous download. The entire package is 3-6 GB.
Bill Barnes

(1) Share these notes here: http://fromthehelpdesk.blogspot.com/2016/06/a-second-thought-on-upgrading-to.html
(2) Find this information in Control Panel > System. If you have 32-bit Win7, but a new computer; the app at https://www.grc.com/securable.htm will determine your CPU's capability.
(3) https://www.microsoft.com/en-us/software-download/windows10/.
(4) Naturally, I recommend you buy Carbonite backup software from me: http://goo.gl/CXqBsB.

Friday, May 27, 2016

Quotes without comment (Windows 10 edition)

Some stories that were recommended for me to read/view:

On Friday I received:


But on Thursday I had already gotten a link to:


(These screenshots are linked to the documents. Click on them for the “full” story.
Open links are below to verify source. For the safest surfing, read the destination [// domain.com/] and copy the link into your browser.)



Tuesday, May 10, 2016

Microsoft will not call you

Pardon the redundant warning ...

I hope this reminder falls in the same category as “buckle your seatbelt” and just reinforces the diligence you already apply in treating every offer from a stranger with a grain of salt. My saying it now was inspired by a warning in a WindowsSecrets newsletter (1) that there is a current rash of this type of scam.

Microsoft will not call you offering to fix a problem you didn’t know you had. (Neither will Dell, Google, Facebook, or anyone else.)

If you get an unsolicited call, email, or popup on your screen referring to some critical issue that only their assistance can repair right now – it’s likely to be a scam!

  • Do not click anywhere inside a popup.
  • Do not install anything that you didn’t go looking for.
  • Do not ever give anyone you don’t know access to your computer or your money.

The exception to these rules might be if you can’t open any of your files and the only thing you can see is a message that you need to send some anonymous entity money – usually via Bitcoin. This is a ransomware infection and it is probably real! In this case, immediately unplug your computer and contact your computer professional. Most likely, you are toast. The only solutions are to pay up or start over with your backup data. Unfortunately, if you delay or attempt to get around this on your own, you also run the risk of corrupting even the good backups you do have.

Feel free to share this with all your friends and relatives who have a computer or telephone and use the internet.

Here’s the open link for WindowsSecrets, because you never click to go to unknown websites from a link you might not trust: http://windowssecrets.com/newsletter/better-localcloud-management-for-big-data-sets/

And a couple weeks later Windows Secrets alerts us to a "support" scam directed against Dell owners:
Support scam alert for Dell users: http://windowssecrets.com/field-notes/tech-support-scams-take-a-disturbing-turn/ (note: this is a 2-part article; scroll down past "Windows 10 ..." to read the report on the new scam). 
(2) Which is where I make my pitch for you to buy your Carbonite automatic, online backup service from me:


Friday, April 29, 2016

Lost passwords

Lost WiFi passwords

Q. How can I find the WiFi password on my router?

If you know the login information to configure your router, just connect to it as an administrator and go to the Wireless > Security section.(1) The password should display there.

If you have a device that already connects to that router, you may be able to extract the password from it. Windows 7 (and XP) will display the plaintext password under Manage wireless networks in the Network and Sharing section. Some Android devices will also show the plaintext saved password.

If you’ve moved past Windows 7(2) (even as an upgrade), the password is not shown in the interface. It is still available as plaintext if you know where to look in the system. The easiest way to find it is with a utility, as I recently did.
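On Windows 8 and 10, the built-in netsh command will also reveal a saved key from an administrator command prompt: netsh wlan show profile name="YourNetwork" key=clear. Given that output, pulling out the password is a one-liner; this sketch runs against a trimmed, made-up sample of netsh's report format (the SSID and key below are fictitious):

```python
import re

SAMPLE = """\
Profile CoffeeShop on interface Wi-Fi:
    Security settings
    -----------------
    Authentication         : WPA2-Personal
    Cipher                 : CCMP
    Security key           : Present
    Key Content            : s3cret-passphrase
"""

def extract_key(netsh_output):
    """Pull the plaintext key out of 'netsh wlan show profile ... key=clear' output."""
    m = re.search(r"Key Content\s*:\s*(.+)", netsh_output)
    return m.group(1).strip() if m else None

print(extract_key(SAMPLE))  # s3cret-passphrase
```

The "Key Content" line only appears when you run netsh with key=clear and administrator rights; otherwise the report just says the security key is present.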

I usually document my research well, but can’t find exactly what I looked at or why this time. There may be a hint in my caveats, below(3). I thought my original impetus was an article in WindowsSecrets, but can’t find it now. You may be able to search for Key Finders on their site.

I did look at Magical Jelly Bean (https://www.magicaljellybean.com/wifi-password-revealer/) and NirSoft (http://nirsoft.net/password_recovery_tools.html) and eventually used a keyfinder program from Magical Jelly Bean to recover WiFi passwords on a Win10 computer. Both sites had been vetted and recommended … somewhere. A colleague frequently uses Magical Jelly Bean.

The program quickly displayed a list of almost three dozen networks I had connected to in the past with this computer, with SSID, password, and some technical information. I captured it as a screenshot, blacked out my own networks, and printed it to carry with my laptop. Yes, this exposes the passwords of many friends and relatives to anyone who steals my bag. But there is no connection between my papers and my friends, so all a thief can do is drive around the country looking for the SSIDs.

It is as important to protect your WiFi password as any other. You may not mind someone using your bandwidth, but anyone connected to your network (either WiFi or wired) could invade any computer on your system – and “computer” includes your phones, game devices, and connected appliances (like a thermostat or light controller) as well. Then any data or settings on them could be vulnerable to attack by stealing the data or malicious destruction. And one of those computers you don’t think of as such is more likely susceptible to becoming a gateway from the outside for bad guys to do even more harm.


Notes and resources:

(1)     If you don't know the login for your router, you can return it to the default settings by pressing a recessed button with a pin. Then you must completely reconfigure all of your settings. Of course, if you don't know the login, you may have never changed the default settings. See my article for tips on critical settings to customize.

(2)     If you’ve got anything with Windows 7 (or XP) that connects with WiFi, you can display the password for each network directly in Windows. With Windows 7, find it at:
Control Panel\Network and Internet\Manage Wireless Networks – get there from the
Network and Sharing Center > Manage wireless networks (on the left sidebar), then open a network’s properties and look on the Security tab

(3)     As always, when researching and downloading non-commercial resources, ALWAYS be careful exactly where you click. (I sometimes use a sacrificial computer to do my research and downloading.) I have a note with my saved passwords that this program tries to co-install a couple of unrelated programs that return money to the publisher. For more information on using “free” software, see my post at http://TechnologyInterpreter.info (May 2016).

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2016- Bill Barnes - Disclaimer - Home Page - Blogs Home

Wednesday, November 11, 2015

Google Fiber Is Coming To Town

Perhaps you’ve heard: Google Fiber selected Charlotte as one of the first nine cities where they will offer their internet and TV service. They have completed their initial surveys, mapped out locations for their networking equipment and begun putting fiber optic cable in the ground. The next step will be running thousands of miles of cable along neighborhood streets to bring service to individual subscribers.

What is Google Fiber?

Google Fiber will be another carrier providing internet and television service to individuals and, presumably, small businesses. As such, they will operate in direct competition with carriers such as Time Warner and AT&T.

Although no specifics are available for Charlotte, service is already available in Austin, Texas. There they offer an option of high speed internet or internet plus TV. Internet only is priced at $70 per month and over 150 channels of TV plus 8-channel recording adds another $60. They also have a basic internet-only service for a one-time installation fee of $300 which can be paid at $25 per month for a year. After the $300 is paid, there are no more charges.

Google’s Gigabit Internet is advertised as “up to 1,000 Mbps.” That is approximately 20 times the speed currently advertised by the major incumbent providers, whose prices for “up to 50 Mbps” are $35 and $65 for the first year. Google Basic Internet offers “up to 5 Mbps” with no costs after the initial fee. Realistically, most home users with typical usage probably would not notice a significant difference at speeds greater than 25 Mbps. For more information, see my blog post “What’s a gigabit” at http://TechnologyInterpreter.info.

Why should you be interested in Google Fiber?
Telecom and internet providers score notoriously poorly in most customer service surveys. Google, a $430 billion company, is known for tackling technological problems with a different viewpoint from traditional players. At the very least, encouraging Google will bring competition to the near monopoly of service currently available, even if you choose not to change your provider.

Furthermore, Charlotte was honored to be chosen by Google immediately after the first three pilot cities. Supporting Google will prove to the rest of the country that Charlotte believes in the 21st century.

What’s next?

Building a completely new infrastructure is a major task and Google will not be able to offer it to the entire city at once. As yet, they have not announced what areas they will start in, but they are collecting addresses to determine what areas show the most interest. Register at https://fiber.google.com/cities/charlotte/ and if enough of your neighbors also do so, you may have another option for internet and TV soon.

NOTE: This item was originally published in The Spirit of Plaza Midwood, Fall 2015.

The writer has no affiliation with any of the businesses mentioned. Google did not respond to a request for specific details. All information presented is from public resources.

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2015- Bill Barnes - Disclaimer - Home Page - Blogs Home

Wednesday, September 16, 2015

What’s a gigabit?

What's a gigabit?

Short answer: it's a billion bits. Thank you. Y'all hurry back.

Honestly, when most people encounter a gigabit it's in the description of their network or internet speed. Usually written Gbps, it means one billion bits per second, which is equivalent to 1000 Mbps or 1000 million (or Mega) bits per second.

The "per second" identifies the distinction between the quantity of information and the rate at which it can be transmitted. Transmission rate is usually expressed in bits while quantity is counted as bytes. Although a byte (the common unit of file size) is equal to 8 bits, communication overhead means the byte rate is approximately 1/10 the bit rate. Thus, to a first approximation, 1 Gbps might move data at about 100 MBps (note that bits are abbreviated with a lower case “b” while bytes are a capital “B”).
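That rule of thumb is easy to sanity-check in a couple of lines (using the article’s assumption of roughly 10 bits on the wire per delivered byte):

```python
def useful_byte_rate(bit_rate_bps, bits_per_byte_on_the_wire=10):
    """Approximate delivered bytes per second: 8 data bits plus ~2 bits
    of protocol overhead per byte, the 1/10 rule of thumb."""
    return bit_rate_bps / bits_per_byte_on_the_wire

one_gbps = 1_000_000_000            # 1 Gbps in bits per second
print(useful_byte_rate(one_gbps))   # 100000000.0 -> about 100 MBps
```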

So, how fast is a Gbps?

Internet Service Providers often describe their speed in terms like “download a movie (or song) in x number of seconds.” Except that most people don’t care much how long it takes to download a file. They want the movie they’re watching right now to play smoothly. With streaming media, the movie downloads gradually over its two-hour running time. One second of HD streaming video might require 4-5 megabits of data (cutting-edge “4K UHD” video might be 25 megabits). Even doubling those numbers for a 100% reserve, that’s a long way from 1,000 Mbps.

“Your mileage may vary…”

Despite the sales pitch, a promotion for 1 Gbps might not deliver 1 Gbps. Just about every internet package is described as “up to xxx Mbps.” This means they’re not actually guaranteeing any particular minimum speed. There are many factors, more or less beyond the ISP’s control, that affect the actual internet speed delivered to an end device.

Alert: tech talk coming up. You may want to jump to “What else is on the line?”

Obviously, the ISP has no control over the connections or devices inside the house. Until recently most computers shipped with wired network cards that wouldn’t communicate faster than 100 Mbps. While the computers’ connections got faster, upstream devices such as home routers may not have been upgraded. Some older routers were limited to only 10 Mbps input from the internet so internal computers could communicate much faster between themselves than with the internet. Ramping up speed with WiFi technology has come later and at much higher cost than for wired connections. Also, most WiFi specifications share a single total capacity with all devices connected to it so any one device’s speed is limited by what others may be doing, even if they're not using the internet.

Before the signal gets to the subscriber’s equipment, the ISP’s local modem may not support their maximum speed. Outside the house, there are limiting factors in some neighborhood transmission technologies. On some systems, all subscribers share a single cable from a local hub. While the cable may be capable of more than the advertised speed, it may not deliver it if everyone is using it at the same time. Other systems may give a single wire to a single subscriber, but the potential speed falls off with distance from the hub. And at every hub from the user to the ISP’s connection to the internet, more users are vying for a finite amount of capacity.

Leaving the ISP does not mean clear sailing for maximum speed. The web page still has to go through up to a couple dozen routers; any of which could be technologically limited, failing, or overloaded and slowing the connection. At the ultimate server the same failings on that end could slow down communications.

Another fly in the ointment is the complexity of the content itself. With the proliferation of rich web-page advertising, some pages contain content from scores of servers all over the world. This material has to be requested individually by the computer, each request running its own gauntlet described above. Some pages downloaded over excellent connections can take up to 45 seconds to complete, even before the streaming services that are the goal of the connection start.

What else is on the line?

If the carrier really delivers a consistent 1 Gbps to the doorstep and all the subscriber equipment is up to the speed, there’s more overhead that could nibble away at the best rate. Services such as telephone or security systems may be constantly consuming capacity. Computers and other devices (is the refrigerator talking to the supermarket yet?) may not be friendly about when they request their updates (corporate internet services have been slowed to a crawl on days that smart phones got an update – without anyone requesting or expecting it). Online data backup and synchronization services need to move large amounts of data and they don’t want to wait until overnight in case it’s needed before then.

But the real bandwidth hogs are the subscription services many assume are separate from the internet connection. Streaming media – audio, and especially, video delivered on demand – consume capacity immediately and continuously. And, the higher quality delivered, the more capacity needed.

Television, even traditional television channels, is decreasingly being viewed via broadcast – one signal delivered to every viewer. Instead, the “tuner” is located at the provider’s offices and each subscriber receives a dedicated stream of the program; even if everyone on the block is watching the same football game. “Digital recording” works the same way with all the recordings and stop points stored in a database and generated from a central server on demand.

How much is enough?

Visualize 3 televisions or computers in the house in use at once. Add another stream for each simultaneous channel a “traditional,” on-premises DVR is recording. That’s 5 Mbps each. Audio streaming is about one-half Mbps per channel. Online gaming is indeterminate, but allow 1 Mbps because it demands immediate response. Add up to a couple more Mbps for incidental services, mail, and web surfing, as these demands are typically intermittent.

Total everything up and double it for reserve and future demands to get a conservative estimate of need. Now match need to affordability. Currently, depending on package and promotion, 50 Mbps may cost $35-$120. If that cost is too high, most streaming services will automatically adjust their quality to the available bandwidth. Most people probably won’t notice the first couple of steps back from ultimate quality. Also, ask the provider whether the extra-cost TV package consumes bandwidth you’re already billed for. Then ask them again and write down their name.
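The budget above is simple enough to script. Here’s a sketch for a hypothetical household with 3 screens in use and a 2-tuner DVR; all the per-stream figures are the rough numbers from the text:

```python
MBPS_PER_VIDEO = 5      # one HD video stream
MBPS_PER_AUDIO = 0.5    # one audio stream
MBPS_GAMING    = 1      # allowance for latency-sensitive gaming
MBPS_MISC      = 2      # mail, web surfing, incidental services

def bandwidth_need(screens, dvr_tuners, audio_streams=1, gaming=True):
    """Conservative Mbps estimate: total everything, then double it."""
    total = (screens + dvr_tuners) * MBPS_PER_VIDEO
    total += audio_streams * MBPS_PER_AUDIO
    total += MBPS_GAMING if gaming else 0
    total += MBPS_MISC
    return total * 2    # reserve and future demands

print(bandwidth_need(screens=3, dvr_tuners=2))  # 57.0 Mbps
```

By this yardstick, even a busy household lands comfortably inside a 50-100 Mbps package, nowhere near a gigabit.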

If the cost for 50 Mbps is not excessive, then consider the upgrade. In some areas 1 Gbps is only $20 more than 50 Mbps. In most areas, 1 Gbps may be promoted but is actually pie-in-the-sky.

And still more gotchas.

While packages may be sold as 50 Mbps or 1 Gbps, these are download speeds. Most residential plans only offer 1 Mbps upload. Web surfing and media streaming have minimal and intermittent upload demands so 1 Mbps is sufficient. Online backups, synchronization, and media sharing may take longer to complete, but are rarely timely. However, more consumers are using two-way audio and video communications which may quickly saturate this capacity; especially if they originate a conference call. Unfortunately, greater upload speeds are mostly available only with business class packages, which are often much more expensive.

All of this discussion has been concerned only with the rate of connection, with no mention of the total quantity of data moved. While most US wired ISPs have not (yet) started metering quantity, most cellular plans do. Cellular may offer up to 25 Mbps, but a continuous download at that rate will burn through a 2 GB (gigabyte) plan in about 10 minutes. A standard DVD movie (not Blu-ray) runs about 4 GB, and an hour of good-quality audio is about 30 MB.
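That 10-minute figure checks out (treating a GB as 1,000 MB, the usual carrier accounting):

```python
def minutes_to_burn_cap(cap_gb, rate_mbps):
    """How long a continuous download at rate_mbps takes to use cap_gb."""
    megabits = cap_gb * 1000 * 8    # gigabytes -> megabits
    return megabits / rate_mbps / 60

print(round(minutes_to_burn_cap(2, 25), 1))  # 10.7 minutes
```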

Mega - Giga, etc.:     https://en.wikipedia.org/wiki/Gigabit

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Thursday, July 2, 2015

Is LastPass Hacked?

In the middle of June 2015, the password manager LastPass sent a message to their users announcing that their internal security had been breached and some tens of thousands of records from one of their databases had been stolen.

Yeah, that’s technical PR talk for “We been hacked!”

Does this mean LastPass is worthless? Should you stop using it? Should you change your password?

Answers: No, No, and Maybe.

If your LastPass master password was weak, you definitely should change it. And if you used your master password anywhere else, you need to change it on every other site where you used it.

A “weak” password is anything that looks like it might have come from a dictionary of any major language, including char@ct3r substitutions or random capitaliZation. A strong password should be at least 15-20 characters long, truly random, and include all four character types.

You can get a quick evaluation of how good your password might be at https://www.grc.com/haystack.htm. For randomness without any unconscious human prejudices, use a good password generator such as several available at grc.com or the one built into LastPass.
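If you’d rather not trust any website or plug-in with the job, Python’s standard `secrets` module can generate a strong password locally. This sketch keeps drawing random characters until all four character types appear:

```python
import secrets
import string

POOLS = [string.ascii_lowercase, string.ascii_uppercase,
         string.digits, string.punctuation]
ALPHABET = "".join(POOLS)

def strong_password(length=20):
    """Random password guaranteed to contain all four character types."""
    while True:
        pw = "".join(secrets.choice(ALPHABET) for _ in range(length))
        if all(any(c in pool for c in pw) for pool in POOLS):
            return pw

print(strong_password())
```

`secrets` draws from the operating system’s cryptographic random source, so the result carries none of the unconscious human patterns a dictionary attack counts on.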

For more technical details on this topic, read on here.

What did LastPass lose?
Apparently records were stolen for a small number of their subscribers from a server containing user names, a hash of the user passwords, and the per-user salt used to create the hash.

A hash ensures that bad guys can’t just log in somewhere with the information they stole; they have to decrypt your actual password from what they have. The fact that LastPass uses a per-user salt prevents them from brute-forcing a dictionary once and comparing the results to their whole take. Instead, they have to brute-force (try every possible character combination) each user individually, because the same password will result in a different hash for each user.

And now they have access to my account?
Now they can start attacking one person’s account, except that LastPass threw them another delaying tactic. Instead of hashing your password once, or 500 times, they hash it 100,000 times before they save it. This requires anyone testing a guessed password against the stolen hash to spend microseconds on each try rather than picoseconds. Even with specialized computers, they can only test a few thousand possible passwords per second.
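This salt-plus-iterations approach is standard “key stretching” (PBKDF2). LastPass’s exact internals aren’t spelled out here, but the idea looks like this using Python’s standard library:

```python
import hashlib
import os

def stretch(password, salt, iterations=100_000):
    """Salted, iterated hash: verifying one guess costs 100,000 rounds."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

salt_a = os.urandom(16)   # a unique random salt per user
salt_b = os.urandom(16)

# Same password, different salts -> completely different hashes, so a
# precomputed dictionary of hashes can't be reused across users.
print(stretch("hunter2", salt_a) != stretch("hunter2", salt_b))  # True
```

The attacker must repeat all 100,000 rounds for every single guess against every single user, which is exactly what slows them to a few thousand guesses per second.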

“Thousands of passwords per second! I’m toast!”

Not necessarily. A simple 6-character password like aaa&1B has 750 billion possible combinations. Even at 100,000 guesses per second, it could take over 40 days to come up with a match. And that match allows them to break into only one account. They have no way of knowing whether the account BoyObama will give them nuclear codes or a teenager’s Twitter account.

And since you have a 12-character password, one out of half a septillion combinations, it could take seven times the age of the universe to crack.
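You can check both of those claims yourself. Assuming an alphabet of the 95 printable ASCII characters (which is how pages like Haystack typically count):

```python
ALPHABET = 95                    # printable ASCII characters
GUESSES_PER_SEC = 100_000        # the offline attack rate discussed above
AGE_OF_UNIVERSE_SEC = 4.35e17    # ~13.8 billion years

combos_6 = ALPHABET ** 6
print(combos_6)                            # 735091890625, ~750 billion
print(combos_6 / GUESSES_PER_SEC / 86_400) # ~85 days for a full search

combos_12 = ALPHABET ** 12                 # ~5.4e23: half a septillion
print(combos_12 / GUESSES_PER_SEC / AGE_OF_UNIVERSE_SEC)
# ~12 universe-ages for a full search; on average the attacker finds
# the password about halfway through, hence "seven times the age"
```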

How many combinations:     https://www.grc.com/haystack.htm
And the number is called:     https://en.wikipedia.org/wiki/Metric_prefix

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Wednesday, July 1, 2015

Data for the ages

While we were publishing an obituary for a long-time member in a club newsletter, several people mentioned that he had written a regular column in the newsletter years before. We thought it would be nice to share a representative sample of his writing with those members who remembered it as well as those who never met him.

During much of that period I was editor of that auspicious publication. I knew I had drafts of most monthly issues. More than that, I knew exactly where the electronic files were; but was afraid they wouldn’t be in a readable format. Amazingly, most of the dated folders contained at least 3 files: allFeb.doc, Feb99_1.p65, and 9902.pdf (http://1drv.ms/1B7kVBD).

I now had both a Microsoft Word (97-2003) file of the collected articles, as submitted, and a PDF of the finished newsletter. It’s been more than 15 years, but I was able to come up with a readable sample of his writing in just a few minutes.

How did this happen?

1) I could find it. Not only did I keep it in an orderly file structure, but I knew where those files were likely to be. Since home computers first came with hard drives, all my household’s data have been saved to a single logical area on a single physical disc. As new computers and technology came along, the data were migrated intact to the new drive in the same location.

I learned long ago that storage was cheaper than organization. When the PC finally drove my typesetting business into the ground in 1995, I had accumulated 2,000 to 3,000 floppy discs(1) on the shelf with all of my clients’ jobs for almost 15 years; from resetting a headline to an entire catalog or complex form. Any file was accessible if I had a single identifying number, which was often built into the finished print.

2) It was physically available. With every new computer, I copied the files to it, so I know that the disc spins and the bits are still readable. For many files, I still have my previous computer, although it has not been powered on for over five years now.(2) I also use Carbonite, a reliable, online(3), commercial backup service.

3) I could read the file format. By virtue of its ubiquity and longevity, Word .DOC files are still accessible by most modern word processors. While I wouldn’t count on Microsoft continuing to support the format in another five years (it was superseded by Office 2007, and they enforce their standard 10-year end-of-life), there are a number of other programs that read it now. With even commercial software now being delivered by download, I’m also keeping the installation files for software on that cheap storage. Hopefully I’ll be able to reinstall an old version if I need it, as long as the x86 instruction set survives.

If anything, the .PDF format is even more universal than .DOC with many programs, including most browsers, now incorporating a reader. And I can always do a new install of Adobe Reader 9 from my archives.

Non Sequitur:

(1)     Those 3,000 floppy discs represent barely 500 MB of data. That reflects the efficiency of storing data in a time before multi-terabyte hard drives. Some of the documents included design complexity to rival what a good secretary would do in a word processor or the word count of a small newspaper, but it was stored as simple codes that gave the printer instructions as to font, size, style, and location to put on the document. It also did not include any images. The color photos alone in an 8-page brochure today could easily add up to that 500 MB.

(2)     It may sound like a compulsive waste of space, but I once thought I might need to recalculate a tax return from many years previous. Although I had the original CD for the software, it would not install on my new computer. Fortunately, the old computer booted with the program and all its updates as of April 15 of the necessary year. Caveat: I was lucky that the computer booted. Even in mothballs, CDs, hard discs and electronics that have sat on a shelf in the garage or attic can deteriorate fatally. And don’t forget, CD drives are fast becoming dinosaurs.

(3)     Carbonite, and most other backup programs, are usually only for backup, not archival purposes. This means when you delete a file off the source disc, the backup service will delete it from their system as well. (Carbonite will keep files that are no longer on your computer for 30 days and then remove them from their system.)

If you have files that you want to preserve, but may not look at for years, you need to take specific precautions. Some possible options might be:
  • You can keep them on your active hard drive so they continue to be backed up.
  • You can move them to your own offline storage and test them at least annually for accessibility. If you do this, you should replicate them on two different types of media such as CDs and flash drives.
  • Or you could manually copy them to a cloud service that does not sync to a local file, including syncing the fact of deletion. At the moment Microsoft’s and Google’s online storage is free – up to a limit – and can be used without syncing. Remember, though, that even these companies have changed their focus and discontinued services; often with little warning.
  • For the extremely technically competent, some paid backup services give you detailed control over retention rules. Amazon has such a service, known mostly to super geeks, for pennies per gigabyte per month; but you might have to wait a day or two for them to retrieve your data.

The best solution is probably a combination of more than one of these options. And for the really valuable documents – drafts of your best-seller, masters to your gold record, Howard Hughes’ will naming you – include a classic analog copy: toner or pigment on archival-grade paper. Beware of inexpensive ink-jet printers: dye-based inks can fade, while the pigments used in better photo printers are much longer-lived.

To preserve non-text content such as images or sounds for generations without having to revalidate them every couple years, the only option is metal. Photos (still or moving) should be saved as color-separated (not an amateur process) silver on a stable base. Classically this is referred to as “black and white film negatives.” The copper master disc for pressing an LP should be sufficient for audio recordings. This is basically the strategy NASA used when they shot the world’s “Hello”s to the stars.

Unlike my 1990-version PageMaker digital files, all of these analog media should be readily decodable with the basic software built into most advanced terran life. Extracting the audio may be a little more difficult, but even 20th century technology should be able to come up with a way to turn physical squiggles on a disc into the corresponding sound, even without a turntable.

More information: https://en.wikipedia.org/wiki/Media_preservation

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Sunday, November 9, 2014

Cellphone supercookies

Verizon and AT&T are adding ‘supercookies’ to your cellphone browsing.

Cookies do not come from Keebler. They are small files in your browser that a website asks you to hold and give back when asked. When they were conceived, soon after the birth of the Web, they were an innocuous means for a web server to remember what you, among the hundreds of people who may be browsing its pages, are doing. Since then, clever programmers have found valuable and sinister ways to use cookies. In response, users and browsers took steps that block not just the bad cookies but the good ones, and the arms race continues.

Thus is born the supercookie, which does not reside in the browser. Generally it is some form of fingerprinting of specific characteristics of your computer. It is easy for a web server to ask the browser to report the plug-ins and fonts it knows about, as well as CPU capability and screen resolution, among other features. It uses these statistics to better customize the web page, graphics, and video it sends you. A half-dozen pieces of information uniquely identify me out of over 4.5 million computers. The website can then collect this information in a database correlated to personal facts it already knows about you.
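As a toy illustration of how a handful of ordinary attributes collapse into one stable identifier (the attribute names here are hypothetical, not the actual fields real fingerprinting scripts collect):

```python
import hashlib

def fingerprint(attributes):
    """Hash a browser's reported characteristics into one stable ID."""
    blob = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

me = {"user_agent": "Mozilla/5.0 ...", "screen": "1920x1080",
      "fonts": "Arial,Calibri,Comic Sans", "plugins": "pdf,flash"}

print(fingerprint(me))   # same browser -> same ID, visit after visit
changed = dict(me, screen="1366x768")
print(fingerprint(changed) != fingerprint(me))  # True: any change shifts it
```

No file is ever stored on your computer; the identifier is simply recomputed from what your browser freely reports, which is why cookie-blocking doesn’t touch it.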

Recently the popular press has picked up on another type of supercookie being fed to us by the cell carriers. Verizon has acknowledged that they’ve added this “feature” since 2012, and it has also shown up in tests of AT&T phones. The technique exploits the fact that your cell carrier, like any ISP, is a man in the middle for everything you send out on their network. In this case, they are adding a text identifier to every HTTP transmission you send over cellular data – it is not included if you connect via WiFi.

Verizon’s goal was to allow websites, for a fee, to send them your code and receive some of the plethora of personal data Verizon knows about you. This could include details such as your demographics, phone number, and which store you just walked into at the mall. Unfortunately for Verizon, because the ID is included whether the website subscribes or not, the website could just as easily build its own dossier on that ID. The ID is still attached to your browsing even if you opt out of allowing Verizon to sell your data.

The only way to block this identifier is to route all of your communications on the cellular network through a secure channel; the carrier cannot attach the ID to HTTPS browsing. Fortunately, major social networking sites such as Facebook, Google, and Twitter use HTTPS all the time. For all the other websites you might visit, your only recourse is to install and use a VPN.

Although Verizon is the only carrier to admit that they include and are monetizing this ID, the technology is available to every cellular company, ISP, or public access site.

Steve Gibson’s Security Now
·         The entire podcast: http://twit.tv/show/security-now/479
·         His show notes and other text: https://www.grc.com/sn/sn-479-notes.pdf
Wired Magazine describes the process
My articles on cookies
EFF fingerprint test
·         https://Panopticlick.eff.org
Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home