Wednesday, September 16, 2015

What’s a gigabit?

Short answer: it's a billion bits. Thank you. Y'all hurry back.


Honestly, when most people encounter a gigabit, it's in the description of their network or internet speed. Usually written Gbps, it means one billion bits per second, equivalent to 1,000 Mbps (1,000 million, or mega-, bits per second).

The "per second" identifies the distinction between the quantity of information and the rate at which it can be transmitted. Transmission rate is usually expressed in bits while quantity is counted as bytes. Although a byte (the common unit of file size) is equal to 8 bits, communication overhead means the byte rate is approximately 1/10 the bit rate. Thus, to a first approximation, 1 Gbps might move data at about 100 MBps (note that bits are abbreviated with a lower case “b” while bytes are a capital “B”).

So, how fast is a Gbps?

Internet Service Providers often describe their speed in terms like "download a movie (or song) in x seconds." Except that most people don't care much how long it takes to download a file. They want the movie they're watching right now to play smoothly. With streaming media, the download is spread across the movie's two-hour running time. One second of HD streaming video might require 4-5 megabits of data (cutting-edge "4K UHD" video might be 25 megabits). Even doubling those numbers for a 100% reserve, that's a long way from 1000 Mbps.
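
To put that gap in numbers, a back-of-the-envelope sketch using the per-stream figures above (the stream counts are illustrative, not a promise about any particular service):

```python
link_mbps = 1000   # the advertised gigabit
hd_mbps = 5        # high end of the HD estimate above
uhd_mbps = 25      # the "4K UHD" estimate above

# Doubling each stream for a 100% reserve, as in the text:
print(link_mbps // (hd_mbps * 2))   # 100 simultaneous HD streams
print(link_mbps // (uhd_mbps * 2))  # 20 simultaneous 4K streams
```

Even a houseful of 4K screens doesn't come close to saturating the pipe.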

“Your mileage may vary…”

Despite the sales pitch, a promotion for 1 Gbps might not deliver 1 Gbps. Just about every internet package is described as "up to xxx Mbps." That phrasing means they're not actually guaranteeing any particular minimum speed. Many factors more or less beyond the ISP's control affect the actual internet speed delivered to an end device.

Alert: tech talk coming up. You may want to jump to "What else is on the line?"

Obviously, the ISP has no control over the connections or devices inside the house. Until recently, most computers shipped with wired network cards that couldn't communicate faster than 100 Mbps. And while the computers' connections got faster, upstream devices such as home routers may not have been upgraded; some older routers were limited to only 10 Mbps input from the internet, so internal computers could communicate much faster among themselves than with the internet. WiFi speeds have ramped up later, and at much higher cost, than wired speeds. Also, most WiFi networks share a single pool of capacity among all connected devices, so any one device's speed is limited by what the others are doing, even if they're not using the internet.

Before the signal even reaches the subscriber's equipment, the ISP's local modem may not support the maximum speed. Outside the house, some neighborhood transmission technologies have limiting factors of their own. On some systems, all subscribers share a single cable from a local hub; the cable may be capable of more than the advertised speed, but not when everyone is using it at the same time. Other systems give each subscriber a dedicated wire, but the potential speed falls off with distance from the hub. And at every hub between the user and the ISP's connection to the internet, more users are vying for a finite amount of capacity.

Leaving the ISP does not mean clear sailing for maximum speed. The web page still has to pass through up to a couple dozen routers, any of which could be technologically limited, failing, or overloaded, slowing the connection. And the destination server can suffer the same failings on its end.

Another fly in the ointment is the complexity of the content itself. With the proliferation of rich web page advertising, some pages contain content from scores of servers all over the world. Each piece has to be requested individually by the computer, and each request runs the same gauntlet described above. Some pages downloaded over excellent connections can take up to 45 seconds to finish loading even before the streaming service that was the point of the connection starts.

What else is on the line?

If the carrier really delivers a consistent 1 Gbps to the doorstep and all the subscriber equipment is up to speed, there's more overhead that can nibble away at the best rate. Services such as telephone or security systems may be constantly consuming capacity. Computers and other devices (is the refrigerator talking to the supermarket yet?) may not be polite about when they request their updates (corporate internet services have been slowed to a crawl on days that smartphones got an update – without anyone requesting or expecting it). Online data backup and synchronization services need to move large amounts of data, and they don't want to wait until overnight in case the files are needed before then.

But the real bandwidth hogs are the subscription services many assume are separate from the internet connection. Streaming media – audio and, especially, video delivered on demand – consume capacity immediately and continuously. And the higher the quality delivered, the more capacity needed.

Television, even traditional television channels, is less and less often delivered as a broadcast – one signal delivered to every viewer. Instead, the "tuner" is located at the provider's offices and each subscriber receives a dedicated stream of the program, even if everyone on the block is watching the same football game. "Digital recording" works the same way, with all the recordings and stop points stored in a database and generated from a central server on demand.

How much is enough?

Visualize 3 televisions or computers in the house in use at once. Add another stream for each simultaneous channel a "traditional," on-premises DVR is recording. That's 5 Mbps each. Audio streaming is about one-half Mbps per channel. Online gaming is indeterminate, but allow 1 Mbps because it demands immediate response. Add up to a couple more Mbps for incidental services, mail, and web surfing, as these demands are typically intermittent.
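
Here's that budget worked out in Python. The per-service figures are the ones above; the two simultaneous DVR recordings and two audio streams are my own illustrative assumptions:

```python
# Household bandwidth budget, in Mbps
video  = 3 * 5.0   # 3 TVs/computers streaming at ~5 Mbps each
dvr    = 2 * 5.0   # assume a DVR recording 2 channels at once
audio  = 2 * 0.5   # assume 2 audio streams at ~0.5 Mbps each
gaming = 1.0       # indeterminate, but latency-sensitive
misc   = 2.0       # mail, web surfing, incidental services

need = video + dvr + audio + gaming + misc
print(need)      # 29.0 Mbps
print(need * 2)  # 58.0 Mbps with the 100% reserve
```

Which lands right around the 50 Mbps tier discussed next.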

Total everything up and double it for reserve and future demands to get a conservative estimate of need. Now match need to affordability. Currently, depending on package and promotion, 50 Mbps may cost $35-$120. If that cost is too high, remember that most streaming services automatically adjust their quality to the available bandwidth, and most people probably won't notice the first couple of steps back from ultimate quality. Also, ask the provider whether the extra-cost TV package consumes bandwidth you're already billed for. Then ask them again and write down their name.

If the cost for 50 Mbps is not excessive, then consider the upgrade. In some areas 1 Gbps is only $20 more than 50 Mbps. In most areas, 1 Gbps may be promoted but is actually pie-in-the-sky.

And still more gotchas.

While packages may be sold as 50 Mbps or 1 Gbps, these are download speeds. Most residential plans offer only 1 Mbps upload. Web surfing and media streaming have minimal and intermittent upload demands, so 1 Mbps is sufficient. Online backups, synchronization, and media sharing may take longer to complete, but are rarely time-critical. However, more consumers are using two-way audio and video communications, which can quickly saturate that capacity, especially if they originate a conference call. Unfortunately, greater upload speeds are mostly available only with business-class packages, which are often much more expensive.

All of this discussion has been concerned only with the rate of the connection, with no mention of the total quantity of data moved. While most US wired ISPs have not (yet) started metering quantity, most cellular plans do. Cellular may offer up to 25 Mbps, but a continuous download at that rate will burn through a 2 GB (gigabyte) plan in about 10 minutes. For scale, a standard DVD movie (not Blu-ray) runs about 4 GB, and an hour of good-quality audio is about 30 MB.
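
The cap arithmetic, spelled out as a quick sanity check of that 10-minute figure:

```python
rate_mbps = 25   # a fast cellular connection
cap_gb = 2       # monthly data allowance

mbytes_per_second = rate_mbps / 8        # megabits -> megabytes
seconds_to_cap = cap_gb * 1000 / mbytes_per_second
print(seconds_to_cap / 60)               # ~10.7 minutes to burn the cap
print(mbytes_per_second * 3600 / 1000)   # ~11.25 GB consumed per hour
```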


References:
Mega - Giga, etc.:     https://en.wikipedia.org/wiki/Gigabit


Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Thursday, July 2, 2015

Is LastPass Hacked?


In the middle of June 2015, the password manager LastPass sent a message to their users announcing that their internal security had been breached and some tens of thousands of records from one of their databases had been stolen.

Yeah, that’s technical PR talk for “We been hacked!”

Does this mean LastPass is worthless? Should you stop using it? Should you change your password?

Answers: No, No, and Maybe.

If your LP master password was weak, you definitely should change it. And if you used your LP master password anywhere else, you need to change it on every other site where you used it.


A “weak” password is anything that looks like it might have come from a dictionary of any major language, including char@ct3r substitutions or random capitaliZation. A strong password should be at least 15-20 characters long, truly random, and include all four character types.

You can get a quick evaluation of how good your password might be at https://www.grc.com/haystack.htm. For randomness without any unconscious human prejudices, use a good password generator such as several available at grc.com or the one built into LastPass.
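
For the curious, a password generator of the sort described is only a few lines. This sketch uses Python's cryptographic `secrets` module; a production generator would also verify that all four character types actually appear in the result:

```python
import secrets
import string

# Draw from all four character types: lower, upper, digits, symbols.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length=20):
    """Return a truly random password, free of human habits."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())  # different every run
```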

For more technical details on this topic, read on here.

What did LastPass lose?
Apparently records were stolen for a small number of their subscribers from a server containing user names, a hash of the user passwords, and the per-user salt used to create the hash.

A hash ensures that bad guys can't just log in somewhere with the information they stole; they have to recover your actual password from what they have. The per-user salt prevents them from brute-forcing a dictionary once and comparing the results against their whole take. Instead, they have to brute-force (try every possible character combination) each user individually, because the same password salted differently results in a different hash for every user.

And now they have access to my account?
Now they can start attacking one person's account, except that LastPass threw them another delaying tactic. Instead of hashing your password once, or 500 times, they hash it 100,000 times before they save it. Anyone testing a guessed password against a stolen hash has to spend microseconds on each try rather than picoseconds. Even with specialized computers, they can only test a few thousand possible passwords per second.
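
The scheme described here – a per-user salt plus many hash iterations – is essentially what standard "key stretching" functions such as PBKDF2 do. A sketch with Python's built-in implementation, using the 100,000-round figure from above; LastPass's exact construction may differ:

```python
import hashlib
import os

def hash_password(password, salt=None, rounds=100_000):
    """Salted, iterated hash: the salt defeats precomputed dictionaries,
    and the rounds make every brute-force guess expensive."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt, digest

# Two users with the SAME password get different hashes:
s1, d1 = hash_password("correct horse battery")
s2, d2 = hash_password("correct horse battery")
print(d1 == d2)  # False
```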

“Thousands of passwords per second! I’m toast!”

Not necessarily. A simple 6-character password like aaa&1B has 750 billion possible combinations. At 100,000 guesses per second, it could take over 40 days to come up with a match. And that match allows them to break into one account. They have no way of knowing whether the account BoyObama will give them nuclear codes or a teenager’s Twitter account.

Since your 12-character password is one out of half a septillion combinations, it could take seven times the age of the universe to crack.
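
The arithmetic behind both claims, assuming the full 95-character printable keyboard and the guessing rate above (on average an attacker finds a match after searching half the space):

```python
charset = 95      # printable ASCII characters
rate = 100_000    # guesses per second, from above

six = charset ** 6            # ~7.35e11, the "750 billion" ballpark
print(six / rate / 86_400 / 2)              # ~42 days, on average

twelve = charset ** 12        # ~5.4e23, about half a septillion
age_of_universe = 4.35e17     # seconds, roughly 13.8 billion years
print(twelve / rate / 2 / age_of_universe)  # ~6-7x the universe's age
```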


References:
How many combinations:     https://www.grc.com/haystack.htm
And the number is called:     https://en.wikipedia.org/wiki/Metric_prefix

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Wednesday, July 1, 2015

Data for the ages


While we were publishing an obituary for a long-time member in a club newsletter, several people mentioned that he had written a regular column in the newsletter years before. We thought it would be nice to share a representative sample of his writing with the members who remembered it as well as those who never met him.

During much of that period I was editor of that august publication. I knew I had drafts of most monthly issues. More than that, I knew exactly where the electronic files were, but I was afraid they wouldn’t be in a readable format. Amazingly, most of the dated folders contained at least 3 files: allFeb.doc, Feb99_1.p65, and 9902.pdf (http://1drv.ms/1B7kVBD).

I now had both a Microsoft Word (97-2003) file of the collected articles, as submitted, and a PDF of the finished newsletter. It’s been more than 15 years, but I was able to come up with a readable sample of his writing in just a few minutes.

How did this happen?

1) I could find it. Not only did I keep it in an orderly file structure, but I knew where those files were likely to be. Since home computers first came with hard drives, all my household’s data have been saved to a single logical area on a single physical disc. As new computers and technology came along, the data were migrated intact to the new drive in the same location.

I learned long ago that storage was cheaper than organization. When the PC finally drove my typesetting business into the ground in 1995, I had accumulated 2,000 to 3,000 floppy discs(1) on the shelf holding all of my clients’ jobs for almost 15 years, from resetting a headline to an entire catalog or complex form. Any file was accessible if I had a single identifying number, which was often built into the finished print.

2) It was physically available. With every new computer, I copied the files to it. I know that the disc spins and the bits are still readable. For many files, I still have my previous computer, although it has not been powered on for over five years now.(2) I also use Carbonite, a reliable, online(3), commercial backup service.

3) I could read the file format. By virtue of its ubiquity and longevity, Word .DOC files are still accessible by most modern word processors. While I wouldn’t count on Microsoft continuing to support the format in another five years (it was superseded with Office 2007 and they are enforcing their standard 10-year end-of-life), a number of other programs read it now. With even commercial software now being delivered by download, I’m also keeping the installation files for software on that cheap storage. Hopefully I’ll be able to reinstall an old version if I need it, as long as the x86 instruction set survives.

If anything, the .PDF format is even more universal than .DOC with many programs, including most browsers, now incorporating a reader. And I can always do a new install of Adobe Reader 9 from my archives.

Non Sequitur:

(1)     Those 3,000 floppy discs represent barely 500 MB of data. That reflects the efficiency of storing data in a time before multi-terabyte hard drives. Some of the documents included design complexity to rival what a good secretary would do in a word processor or the word count of a small newspaper, but it was stored as simple codes that gave the printer instructions as to font, size, style, and location to put on the document. It also did not include any images. The color photos alone in an 8-page brochure today could easily add up to that 500 MB.

(2)     It may sound like a compulsive waste of space, but I once thought I might need to recalculate a tax return from many years previous. Although I had the original CD for the software, it would not install on my new computer. Fortunately, the old computer booted with the program and all its updates as of April 15 of the necessary year. Caveat: I was lucky that the computer booted. Even in mothballs, CDs, hard discs and electronics that have sat on a shelf in the garage or attic can deteriorate fatally. And don’t forget, CD drives are fast becoming dinosaurs.

(3)     Carbonite, and most other backup programs, are usually only for backup, not archival purposes. This means when you delete a file off the source disc, the backup service will delete it from their system as well. (Carbonite will keep files that are no longer on your computer for 30 days and then remove them from their system.)

If you have files that you want to preserve, but may not look at for years, you need to take specific precautions. Some possible options might be:
  • You can keep them on your active hard drive so they continue to be backed up.
  • You can move them to your own offline storage and test them at least annually for accessibility. If you do this, you should replicate them on two different types of media such as CDs and flash drives.
  • Or you could manually copy them to a cloud service that does not sync to a local file, including syncing the fact of deletion. At the moment Microsoft’s and Google’s online storage is free – up to a limit – and can be used without syncing. Remember, though, that even these companies have changed their focus and discontinued services; often with little warning.
  • For the extremely technically competent, some paid backup services give you detailed control over retention rules. Amazon has such a service, known mostly to super geeks, for pennies per gigabyte per month; but you might have to wait a day or two for them to retrieve your data.

The best solution is probably a combination of more than one of these options. And for the really valuable documents – drafts of your best-seller, masters to your gold record, Howard Hughes’ will naming you – include a classic analog copy: toner or pigment on archival-grade paper. Beware of inexpensive ink-jet printers: dye-based inks can fade, while the pigments used in better photo printers are much longer-lived.

To preserve non-text content such as images or sounds for generations without having to revalidate them every couple years, the only option is metal. Photos (still or moving) should be saved as color-separated (not an amateur process) silver on a stable base. Classically this is referred to as “black and white film negatives.” The copper master disc for pressing an LP should be sufficient for audio recordings. This is basically the strategy NASA used when they shot the world’s “Hello”s to the stars.

Unlike my 1990-version PageMaker digital files, all of these analog media should be readily decodable with the basic software built into most advanced terran life. Extracting the audio may be a little more difficult, but even 20th century technology should be able to come up with a way to turn physical squiggles on a disc into the corresponding sound, even without a turntable.

More information: https://en.wikipedia.org/wiki/Media_preservation

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Sunday, November 9, 2014

Cellphone supercookies



Verizon and AT&T are adding ‘supercookies’ to your cellphone browsing.

Cookies do not come from Keebler. They are files in your browser that a website asks you to hold and give back when it asks for them. When they were conceived, soon after the birth of the Web, they were an innocuous means for a web server to remember what you, among the hundreds of people who may be browsing its pages, are doing. Since then, clever programmers have found valuable and sinister ways to use cookies. In response, users and browsers took steps that block not just the bad cookies but the good ones, and the arms race continues.

Thus was born the supercookie, which does not reside in the browser. Generally it is some form of fingerprinting of specific characteristics of your computer. It is easy for a web server to ask the browser to report the plug-ins and fonts it knows about, as well as CPU capability and screen resolution, among other features. It uses these statistics to better customize the web page, graphics, and video it sends you. But a half-dozen such pieces of information uniquely identify me out of over 4.5 million computers. The website can then collect this information in a database correlated with personal facts it already knows.
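
The math behind that claim is just counting bits. Singling out one computer among 4.5 million takes about 22 bits of identifying information, and a handful of reported attributes gets there quickly (the per-attribute entropy values below are hypothetical):

```python
import math

# Bits needed to uniquely identify one machine among ~4.5 million:
print(math.log2(4_500_000))  # ~22.1 bits

# Six attributes (fonts, plug-ins, screen size, ...) at a hypothetical
# ~4 bits of entropy apiece add up to more than enough:
attribute_bits = [4, 4, 4, 4, 4, 4]
print(sum(attribute_bits))   # 24 bits
```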

Recently the popular press has picked up on another type of supercookie being fed us by the cell carriers. Verizon has acknowledged that they’ve added this “feature” since 2012 and it has also shown up on tests of AT&T phones. The technique involves the fact that your cell carrier, like any ISP, is a man in the middle for everything you send out on their network. In this case, they are adding a text identifier to every HTTP transmission you send over cellular data – it is not included if you connect via WiFi.

Verizon’s goal was to allow websites, for a fee, to send them your code and receive some of the plethora of personal data Verizon knows about you. This could include details such as your demographics, phone number, and which store you just walked into at the mall. Unfortunately for Verizon, because the ID is included whether the website subscribes or not, the website could just as easily build its own dossier on that ID. And the ID is still attached to your browsing even if you opt out of allowing Verizon to sell your data.

The only way to block this identifier is to send everything you do on the cellular network through a secure channel. The carrier cannot attach the ID to HTTPS browsing. Fortunately, major social networking sites such as Facebook, Google, and Twitter use HTTPS all the time. For all the other websites you might visit, your only recourse is to install and use a VPN.

Although Verizon is the only carrier to admit that they include and are monetizing this ID; the technology is available to every cellular company, ISP, or public access site.

---------------
References:
Steve Gibson’s Security Now
• The entire podcast: http://twit.tv/show/security-now/479
• His show notes and other text: https://www.grc.com/sn/sn-479-notes.pdf
Wired Magazine describes the process
My articles on cookies
EFF fingerprint test
• https://Panopticlick.eff.org
Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Sunday, October 5, 2014

The slippery slope


I fell into a well. I knew it was there; the field is pockmarked with wells. Some are camouflaged while others have a big sign that says “jump in here!” Some of the shallower ones are actually the more dangerous.

The well I landed in is one of the deepest, but, hopefully, one of the less dangerous. Even so, I caught myself near the top and set a bosun’s chair, but it keeps slipping farther down the well.

The well is called an ecosystem and its purpose is to ensure that once you are in one company’s ecosystem, you will consume more and more of their products to the exclusion of their competitors.

In personal computing, the first serious ecosystem competition was Apple vs Microsoft. Once you made a commitment to one operating system or the other, your choice of software was pretty much determined, with little overlap. With the beginning of broadly available online connectivity, the battle was between networks such as AOL and CompuServe, which initially couldn’t even exchange email. Now the competing ecosystems are the likes of Amazon and eBay for merchandise and Facebook and Google for everything else.

Why does business need an ecosystem? It’s branding to the nth degree. When I was growing up, you were either a Chevy or a Ford person. Later it was Coke or Pepsi. Loyalty to a name could ensure prosperity for a company, independent of the quality of the product. Now it’s “do you live on a wall or in a hangout?”

Say you want an e-reader with a broad and reliable supply of books. You download the Kindle apps and register for an account to buy books and synchronize your desktop reader with your phone’s. With the next best seller you buy, “you can get the Audible version too for $3.” And, “this book was made into a movie – watch it on Prime.” Later you need a toaster for a cousin’s wedding – order from Amazon because you get free shipping. That’s an ecosystem.

The ecosystem I fell into is Google. Beware the credo of the internet that “if you can’t figure out what the website is selling, you are the product.” Google delivers us to its advertisers. More than that, it delivers our profile to its advertisers.

Early in the commercialization of the web, online advertising was like magazine advertising. A site might attract sci-fi junkies or wine aficionados, but if one person moved from one site to the other, there was no way to know it was the same person. Then along came DoubleClick. They realized that if everyone had ads from them, they could read their own cookies regardless of who owned the content. Then they would know that I drink wine, watch Dr Who, and am also shopping for a snowmobile. So I get skiing ads on Wine Spectator and comic-con ads at Eddie Bauer.

Google’s got a pot of money and is looking for synergistic businesses to buy. So they pick up DoubleClick and then YouTube (lots of interest-specific profiling to do there). Hop over to their core product and what takes up the prominent position in any search? Ads. Ads that not only apply to your current search, but also all of your web surfing.

They also created an email service where people spend lots of time, and they provide a pretty decent online office suite. Of course, to use those personalized services, you have to sign in to their system. For convenience, one sign-in gives you access to all these services – and leave the “keep me signed in” box checked so you don’t even have to enter your password every time you restart your browser. Now your searches are not just an anonymous cookie; they are you, with a detailed profile including a name, email address, chronic diseases, and more. Don’t worry, Google’s motto is “Don’t be Evil.”

How do I cope with the ecosystem?


I make the effort to uncheck “keep me signed in” and try to remember to sign out when I’m done. I avoid logging into other sites while logged into high-value sites (financial or personal information). I have four browsers and never sign in to any account from two of them. I dig into my browser’s configuration to ensure “do not track” is enabled and third-party cookies are disabled. I also set all cookies to be cleared when I close the browser – but that can be a real nuisance sometimes. I use the Firefox plugins Ghostery, to alert me who (besides the site I actually went to) is watching what I do, and NoScript, to ensure those third parties can’t sneak malicious or tracking code onto the pages I’m viewing.

By the way, if you carry a smartphone, you’re permanently in Google’s or Apple’s ecosystem (or Microsoft’s for a couple of you). This is in addition to Verizon’s and ATT’s ecosystem, or whoever your carrier is, which has been true as long as there have been portable phones. You might also be in Samsung’s or Amazon’s or HTC’s ecosystem if the phone manufacturer chooses to watch over you for more than system upgrades.

If you’ve installed an app from Facebook, Twitter, or a myriad of others; they also could be watching over you even if you’re not actively using the app. And now some retailers and entire malls have technology that can identify the radio signals your phone is constantly putting out to track you from sweaters to socks or from Gap to Banana Republic to Sears Automotive.

The only way to stay out of the well is to stay out of the field. But we know that means living in the 20th century. Why did we so expectantly await the future?


NOTE: Products and companies are named as representative. It is not my intention to imply any one person or company is better or worse than any other.


Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Tuesday, April 29, 2014

The power of notoriety

Heartbleed and you

Just after I posted my item on The Second Factor (1), I saw a syndicated story under the heading of “Double-layer passwords offer additional protection online.” It started out “If the Heartbleed security threat teaches us anything, it’s that passwords don’t offer total protection.” (2)

While that was a good article on why and how multi-factor authentication is valuable, its reference to Heartbleed was as valid a lead as a picture of a scantily-clad model.

Don’t get me wrong. Both topics are important considerations for your security online. But I haven’t seen any clear lay explanation as to the risk and impact of Heartbleed on the ordinary user.

First the technical details. Heartbleed was a flaw in OpenSSL, a component of certain webservers, from 2012 until April 9, 2014. Depending on whom you listen to, this flaw could have affected between 17% and 66% of all websites on the internet. I’m inclined to lean toward the lower number.(*) But this is still serious since the count is of sites, which included the likes of Google and Yahoo.

Due to the high risk from the flaw, the OpenSSL team issued a patch within 72 hours of being notified, and in less time from any broad awareness. Most of the larger secure sites implemented the patch immediately. In fact, I first became aware of the flaw when I started getting notices from my banks that they either were never vulnerable or had already patched their systems.

What does Heartbleed do? The flaw allows a hacker to induce the server to send him the contents of a small portion (64 kilobytes) of its memory (remember that a webserver will have 8-64 gigabytes of memory). The affected memory would contain random bits of the server’s recent activity.

This memory could be the contents of its own webpages which anyone could view. It could be the computer language instructions to manage the website. It could also be bits of the conversation between your browser and the server that establishes your secure connection in advance of telling your bank to pay a bill.

All the hacker has to do is get the same server to send him a few thousand 64 KB downloads. Then he has to scan through the mostly binary data for the flecks of gold that are recognizable. Once found, he has to refine those flecks into real knowledge that he can exploit for value. (If that sounds tedious, all of the tasks can be automated.)
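
A quick calculation suggests why "a few thousand downloads" is no exaggeration; this assumes each 64 KB leak samples a fresh region of memory, which a real attack can't guarantee:

```python
leak_kb = 64           # maximum memory exposed per Heartbleed request
server_memory_gb = 8   # low end of the 8-64 GB range above

kb_of_memory = server_memory_gb * 1024 * 1024
print(kb_of_memory // leak_kb)  # 131,072 requests to equal 8 GB of leaks
```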

What is the risk to you? It’s possible those flecks of gold may include your account name and password. But if the hacker’s goal is passwords, that’s an inefficient way to get them. Every week credentials are being stolen in million-account lots through other security lapses and flaws.

A far more valuable nugget to look for is the webserver’s master key to all its SSL/TLS communications. If a hacker has this, he can create a fake website that your browser will accept as authentic. Then he can execute a perfect phishing or man-in-the-middle attack against any visitor to his bogus site. He can also decrypt previous “secure” traffic to most sites. Of course, the latter two attacks require the hacker either be in the middle or have access to previously recorded internet traffic.

What should you do? Unfortunately, because of how the flaw works, there is no way to know whether a specific site has, or has not, been hacked. If a site you deal with has advised you that it has eliminated any risk from this flaw, you should change your password for that site. Take this opportunity to use a strong and unique password for each of your high-value web accounts. If available, you might enable 2-factor sign-on to reduce the possibility of an account being hijacked.

Once a site has been patched, it should have received a new SSL certificate and revoked the old, compromised certificate. Unfortunately, as of this writing, there is no reliable way to ensure you aren’t accepting a stolen certificate. Some browsers, maybe with some deep settings, will warn you that a certificate has been revoked. There is one known website that can actually serve you a revoked certificate so you can test whether your browser properly recognizes it: go to http://revoked.grc.com(3) and you should receive an error. If your browsers test good and give you the error, you can read more about revocation at (4).

There’s more at risk than just websites. Although I have not seen an authoritative list, SSL is by far the dominant method of protecting electronic communication on the internet. Potentially vulnerable services run the gamut from sophisticated private VPNs to heavily used consumer cloud storage services. They could also include the likes of email, chat and VoIP, or routers both for home use and for controlling the internet.

Unfortunately, many of these services are either embedded deeply into the technology or are never managed again after the original configuration. They will be patched slowly or not at all.

Fortunately, as the variety of exploitable programs increases, the number of users of each shrinks. If the host in a peer-to-peer network serving two nodes is invaded, it could be devastating for those two, but it will not affect anyone else.

As with the count of potentially vulnerable webservers, many of these services are not at risk from Heartbleed because their communications are not encrypted in the first place. Your chat, email, and cloud backups have been coursing through the internet as plain text, easily readable by anyone with a tap on the line – and I don’t mean just governments.

Bill Barnes with Dewey Williams, PCCC

----------
Notes
(*) The reported risk to 66% of all websites refers to the number of websites running webserver programs that might use OpenSSL, primarily Apache and nginx.
This number has to be reduced by the large number of websites that don’t even offer SSL. Again, subtract the ones that did not install the affected versions of OpenSSL and you get a much lower percentage of the Web. However, with worldwide websites numbering in the 9 digits (decimal), whether the affected percentage is 30% or 10%, it’s still a huge number. (5)


References
(1) Blog post “The Second Factor”. http://fromthehelpdesk.blogspot.com/2014/04/the-second-factor.html
(2) AP article “Double-layer passwords …” read in The Charlotte Observer. http://www.charlotteobserver.com/2014/04/26/4865676/tech-tips-double-layer-passwords.html#storylink=cpy
(3) Test website for a revoked certificate. http://revoked.grc.com
(4) Explanation of revoked certificates. https://www.grc.com/revocation.htm (also on podcasts referenced below)
(5) The number of websites truly at risk. http://news.netcraft.com/archives/2014/04/08/half-a-million-widely-trusted-websites-vulnerable-to-heartbleed-bug.html.

More references
An early announcement on Heartbleed. http://pc3.org/heartbleed-bug-affects-60-of-secure-internet-servers/
Text and podcasts on Heartbleed. https://grc.com/sn


Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes - Disclaimer - Home Page - Blogs Home

Sunday, April 27, 2014

The Second Factor

Sometimes when creating or logging into an online account the system will ask for a phone number or a second email address. Recently my users are asking me “why do they want that?” One user ignored the request so many times the system locked her out of a portion of her account until she provided it.

This alternate point of contact is used for second-factor authentication, a means for the website to verify that you are the person who signed up for the account. It is similar to your bank asking for the last digits of your Social Security Number or the doctor’s office wanting your date of birth: bits of information that they know came from you and that should differ from anyone else who might share your name or other primary login.

This is not the same as when a website shows you a picture of the Statue of Liberty or a Corvette after you’ve logged in. There, the website is proving its identity to you, because an imposter would not know which picture you are expecting. Second-factor authentication lets you prove to the website that you are you.

If the website offers a second factor, it’s a good thing. Imagine someone looking over your shoulder stole your password. They could then log in as you and change your settings so that you no longer get notifications from the site. If it were a shopping site with a memorized credit card, you might not know what they’re buying until you get the bill.

Typically the second factor will send you a one-time code that you must enter before proceeding. Check your email, type 4-6 digits or click a link, and you’re in. Often it will set a cookie in your browser and not inconvenience you even that much every time.

Ideally, the second factor should be delivered out of band – that is, through a different network than the one you used for your first factor. An excellent option is to deliver a website’s code by cell text or voice telephone. If, instead of looking over your shoulder, someone stole your computer, he might have access to your email as well as the website.
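
A minimal sketch of what the website's side of this might look like, assuming a hypothetical SMS gateway for the out-of-band delivery (the `send_sms` stub below stands in for a real carrier API):

```python
import secrets
import time

def send_sms(phone, message):
    print(f"SMS to {phone}: {message}")  # stand-in for a real SMS gateway

def issue_code(phone, store, ttl=300):
    """Generate a 6-digit one-time code and deliver it out of band."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    store[phone] = (code, time.time() + ttl)  # expires in 5 minutes
    send_sms(phone, f"Your sign-in code is {code}")

def verify_code(phone, attempt, store):
    code, expires = store.pop(phone, (None, 0))  # one-time: removed on use
    return attempt == code and time.time() < expires

codes = {}
issue_code("555-0100", codes)
```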

Second factor is more reliable than asking how many sisters you have or which high school you went to. Someone who’s gone to the trouble of stealing your identity could also find out that information. Instead it relies on responding with unique real-time information delivered to a device you would likely not lose at the same time as losing your computer.

If you provided the second-factor channel (such as your cell phone number) at the time you created the account, there is little chance it could be hijacked. You’re well on your way to accomplishing the triumvirate of identity: something you know, something you have, something you are. That is: your logon and password (both something you know), your cell phone or a dongle (something you have), and your biometrics (like a fingerprint).


Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
    (cc) 2014 Bill Barnes - Disclaimer - Home Page - Blogs Home

Saturday, April 5, 2014

If you really want to keep your Windows XP


These tips are in no particular order. Note that some tips may require also following other tips that might come after or before them.
  • Always log on as a Limited User unless you absolutely must update some software.
    Lack of administrator rights blocked over 90% of the Windows OS* malware in 2013.
  • Keep all your software and applications up to date. Make a list of programs that need regular updates and check for updates at least monthly.
  • Don’t use Internet Explorer; install the latest versions of Opera, Chrome, or Firefox.
  • Install and use the NoScript and Ghostery plugins for Firefox.
  • Uninstall JAVA. At least, disable it in all browsers.
  • Uninstall or restrict use of Adobe products. A recommended alternate PDF reader is Sumatra (I have not used it). Use the built-in readers in Chrome or Firefox instead of a plugin. 
  • If downloading an Office document, preview it in a viewer instead of the full program. Disable any macros.
  • Uninstall Microsoft Security Essentials and use a 3rd party antivirus such as the free options from Avast, AVG and others.
  • Upgrade to Microsoft Office 2007 or newer. Better still, move to a non-Microsoft suite.
  • Upgrade to Internet Explorer 8 (the highest level that works with XP).
  • Don’t access the internet (including email) from your XP computer. Don’t install unknown software downloaded from the internet by other computers.
  • If you must browse the web, restrict the ability of malware to get to you:
    • Ensure you are behind a router – the first-line firewall – and that Windows firewall is active.
    • Configure your email reader to display only text – no pictures or links.
    • Use Firefox with NoScript. Learn the controls in NoScript and don’t casually allow everything.
    • Browse only to sites you are familiar with.
  • If you must use email on XP, restrict the ability of malware to get to you:
    • Use webmail. In particular, gMail online is practically immune to transmitting malware to your system.
    • Use a mail client other than Outlook or Outlook Express.
    • Configure your mail client to display messages as “text only.”
    • Do not open email attachments or follow links until you have independently verified with the sender they are benign. Read our article on evaluating an email.
  • Shut your computer off when not using it.
    You may discover you have very little need for XP. Plus, older computers are less efficient and you’ll save on your energy bill. 
* Logging on as a Limited User will block most malware that attacks flaws in and installs to the Windows operating system. This does not include malware that attacks flaws in individual programs such as JAVA, email, Microsoft Office, or .pdf documents.

Additional References
Some of these references are documents and must be downloaded and viewed in their program. Yes, they're safe for XP.
PC Club of Charlotte’s original presentation
http://pc3.org/smfpc3/index.php/topic,266.0.html and
http://zaitech.com/articles/misc/download_documents/TheEndOfXP.docm

Security researcher Steve Gibson’s comments:
https://www.grc.com/sn/sn-447-notes.pdf (first page) and
http://twit.tv/show/security-now/447.

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(c) 2014 Bill Barnes - Disclaimer - Home Page - Blogs Home




Friday, September 6, 2013

Ctrl-Shift-P – your new best friend

Do you flip among reading webmail, social networking, gossip sites, research, and shopping or banking in the same browser? Do you realize that browsing history, search details, and even some logon information could leak from site-to-site?

Google – and, not to pick on just them, many other providers – encourages you to sign in to your account “for a fuller browsing experience” as soon as you open your browser. What they really want is to establish a relationship – i.e., a cookie – with you so they can follow what you’re doing and suggest (sponsored) alternatives to your current choice. This tracking capability is not hidden, nefarious, or necessarily malicious. It’s designed into the Web and browsers, as well as your user agreement, and can only be thwarted with obscure configurations in each browser’s profile.

I used to alleviate my concerns over this leakage by closing all my browser sessions and reopening the browser before doing any financial transactions. One time I was trying to sign in to a major shopping site with a different profile than I usually used. No matter which of my usual tricks I tried, it still insisted on pulling up my personal profile. Obviously they were tracking me with multiple cookies from multiple domains, and I would have had to completely clear that browser’s history to get a fresh login. My only alternative was to use a company computer from which no one had ever signed into that site.

Most modern browsers now offer some form of “private” or “incognito” browsing. A private session is a pristine instance of the browser with no history, no cookies, no remembered passwords. When you close the private instance of the browser (the entire window and all child windows – not just logging off the signin), it deletes all record of that session so the details can’t be tracked across other websites. The next time you open a private instance, you’re starting over again.

When private browsing was first introduced it closed or locked out your “regular” instances of that browser so you couldn’t do anything else while in it. Now it functions just like another browser window that you can switch back and forth between. Identify your private session by the notation in the title bar.

Caveats:

A private browser instance only protects you against session-to-session browser tracking, persistent cookies, and cross-site leakage. It doesn’t stop the web server from fingerprinting(1) your computer, or any malware already installed on your computer or the server. All the cookies and history already in your system are still available to the browser – private browsing just makes all the cookies you get now “session” cookies. And if you browse to any other sites in the same private instance, you might as well have used a default instance.

Private browsing is not a sandbox you can use with abandon; just a slight improvement over the wide open web. For better protection against tracking and leakage you need to use a pristine “computer” by booting to a live CD or a clean virtual machine. This works whether doing your banking or surfing to questionable websites – just be sure you reboot in between.

A private browser session is available from the main menu of most browsers. In Firefox and Internet Explorer, you can use the hotkey Ctrl-Shift-P. Chrome and Opera use Ctrl-Shift-N.

(1)      http://Panopticlick.eff.org

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(c) 2013 Bill Barnes - Disclaimer - Home Page - Blogs Home

Friday, July 19, 2013

Microsoft abandons retail market

If you don’t follow technology at a certain level, you may have missed the news: Microsoft has told the consumer market to kiss off. That’s not exactly how they said it, but we know that old adage about actions and words.

First they took a page from the cable TV industry and obfuscated their business pricing such that it is impossible to compare products and pricing. In fact, there is a certification just to support Microsoft licensing, and most businesses that don’t shop at Best Buy hire a consultant to advise on and manage their relationship. Then they introduced an all-online version of their flagship product (yes, Office makes more money than Windows). The revolutionary (meaning Google’s only been doing it for half a decade) online and installable Office 365 is subscription only and starts at $9.99 per month. Forever.

That takes care of soaking or eliminating people who just want to write a letter. What about the people who make it possible for small business to run efficiently and economically?

Recently they announced that they are eliminating the primary program used by most people who support their products but don’t wear a Microsoft shirt. TechNet is a subscription that provides one or more copies of just about every product that anyone short of thousands of users across multiple sites could need – “for evaluation purposes only, not for use in production environments.” Depending on the promotion du jour, I gladly paid $150-$350 per year for the privilege of using 5% of what they offered me. Unfortunately, too many people were getting TechNet and sharing or selling the individual license keys.

My primary business is supporting people who use 1-10 computers. They don’t make their money with their computer; but they do have to use their computer to make money. When your client consists of an accountant and her receptionist/account manager/billing clerk, you get calls to do everything from buying a printer cable to establishing a new office. This includes setting up and optimizing their email, preparing a customized mailing, or converting the contents of their client’s database to a simplified spreadsheet.

Those tasks require Outlook, Word, Excel, and possibly more Office programs. I feel no guilt using my TechNet versions of these applications because I am supporting an office that paid the highest price for their own copies. I’ll tell you a secret, Microsoft. I could also complete these tasks using apps from Google or Open Office. Those cost me and my client – nothing. More significantly, they earn you nothing.

Microsoft: for every IT professional who gives you thousands of dollars a year to keep a certificate on their wall, there are 5 or 10 or dozens more who support and promote your retail products. You just turned them into Google and Linux salespeople. Retail may not be the goose that lays golden eggs, but it is a cash cow that gives good milk year after year. When the time comes that all your money is from the enterprise, it’s HP and Dell that tell them what to buy. Remember, they also sell tablets and laptops as well as management services. If you continue encroaching on their hardware and support businesses, they too may become Linux salespeople.

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(c) 2013 Bill Barnes - Disclaimer - Home Page - Blogs Home
 

Tuesday, April 30, 2013

Some people don't count

Always check your coverage - ideally with a real device on the network - before you choose a cell carrier.

