Wednesday, November 11, 2015

Google Fiber Is Coming To Town


 Note: This article has been updated. Scroll to the bottom to see the status in 2019.

Perhaps you’ve heard: Google Fiber selected Charlotte as one of the first nine cities where they will offer their internet and TV service. They have completed their initial surveys, mapped out locations for their networking equipment and begun putting fiber optic cable in the ground. The next step will be running thousands of miles of cable along neighborhood streets to bring service to individual subscribers.

What is Google Fiber?

Google Fiber will be another carrier providing internet and television service to individuals and, presumably, small businesses. As such, they will operate in direct competition with carriers such as Time Warner and AT&T.

Although no specifics are available for Charlotte, service is already available in Austin, Texas. There they offer a choice of high-speed internet or internet plus TV. Internet only is priced at $70 per month, and over 150 channels of TV plus 8-channel recording adds another $60. They also have a basic internet-only service for a one-time installation fee of $300, which can be paid at $25 per month for a year. After the $300 is paid, there are no more charges.

Google’s Gigabit Internet is advertised as “up to 1,000 Mbps.” That is approximately 20 times the speed currently advertised by the existing major providers, whose prices for “up to 50 Mbps” run $35 and $65 for the first year. Google Basic Internet offers “up to 5 Mbps” with no costs after the initial fee. Realistically, most home users with typical usage probably would not notice a significant difference at speeds greater than 25 Mbps. For more information, see my blog post “What’s a gigabit?” at http://TechnologyInterpreter.info.

Why should you be interested in Google Fiber?
Telecom and internet providers notoriously rank near the bottom of most customer service surveys. Google, a $430 billion company, is known for tackling technological problems from a different viewpoint than the traditional players. At the very least, encouraging Google will bring competition to the near-monopoly we have today, even if you choose not to change your provider.

Furthermore, Charlotte was honored to be chosen by Google immediately after the first three pilot cities. Supporting Google will prove to the rest of the country that Charlotte believes in the 21st century.

What’s next?

Building a completely new infrastructure is a major task and Google will not be able to offer it to the entire city at once. As yet, they have not announced what areas they will start in, but they are collecting addresses to determine what areas show the most interest. Register at https://fiber.google.com/cities/charlotte/ and if enough of your neighbors also do so, you may have another option for internet and TV soon.

NOTE: This item was originally published in The Spirit of Plaza Midwood, Fall 2015.

Update: March 2019

 It’s been 3 ½ years and most Charlotte neighborhoods still don’t have Google Fiber. You might find some answers in this report from radio station WFAE's FAQ City: What Happened To Google Fiber? 

There are two hypotheses that seem feasible to me:

(1) They never intended to provide broad fiber service. Instead, the project was a ruse to scare the incumbent internet carriers into offering better service. Then Google’s core services, including bandwidth-intensive YouTube and apps as well as their bread-and-butter search, would be more responsive for everyone. If so, they have succeeded: both Spectrum and AT&T have significantly upgraded their service, and the latter is even installing some fiber of its own.

(2) Google miscalculated how complex and expensive building out fiber in a dozen or more cities would be. They dropped some of the “second tier” locations outright and imposed a serious go-slow on the others.

In fact, they have not abandoned Charlotte and are still adding new customers. It appears those new customers are primarily in the gazillions of new apartments going up in millennial-friendly neighborhoods. Those buildings are obviously less expensive to serve, since a single installation can reach dozens to hundreds of new customers. The buildings are likely even prewired, so Google only has to connect the master utility hookup.


----------------------
Open links: 
WFAE: https://www.wfae.org/post/faq-city-what-happened-google-fiber

Disclaimers:
The writer has no affiliation with any of the businesses mentioned. Google did not respond to a request for specific details. All information presented is from public resources.

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2015- Bill Barnes

Wednesday, September 16, 2015

What’s a gigabit?



Short answer: it's a billion bits. Thank you. Y'all hurry back.


Honestly, when most people encounter a gigabit, it’s in the description of their network or internet speed. Usually written Gbps, it means one billion bits per second, which is equivalent to 1,000 Mbps, or 1,000 million (Mega) bits per second.

The "per second" identifies the distinction between the quantity of information and the rate at which it can be transmitted. Transmission rate is usually expressed in bits while quantity is counted as bytes. Although a byte (the common unit of file size) is equal to 8 bits, communication overhead means the byte rate is approximately 1/10 the bit rate. Thus, to a first approximation, 1 Gbps might move data at about 100 MBps (note that bits are abbreviated with a lower case “b” while bytes are a capital “B”).
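
For readers who like to see the arithmetic, here is a minimal sketch of that conversion. The roughly 10-bits-per-byte figure folds in protocol overhead and is my own back-of-the-envelope approximation, not a number published by any carrier.

    # Rough conversion from an advertised bit rate to a usable byte rate.
    # The 10-bits-per-byte figure includes protocol overhead and is only
    # an approximation, not a carrier specification.

    def effective_mbytes_per_second(advertised_mbps, bits_per_byte_with_overhead=10):
        """Approximate usable megabytes per second for an advertised megabit rate."""
        return advertised_mbps / bits_per_byte_with_overhead

    print(effective_mbytes_per_second(1000))  # 1 Gbps -> about 100 MBps
    print(effective_mbytes_per_second(50))    # 50 Mbps -> about 5 MBps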

So, how fast is a Gbps?

Internet Service Providers often describe their speed in terms like “download a movie (or song) in x number of seconds.” Except that most people don’t care much how long it takes to download a file. They want the movie they’re watching right now to play smoothly. With streaming media, the movie downloads over the two hours they spend watching it. One second of HD streaming video might require 4-5 megabits of data (cutting-edge “4K UHD” video might need 25 megabits). Even doubling those numbers for a 100% reserve, that’s a long way from 1,000 Mbps.
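
To put those numbers side by side, here is a quick back-of-the-envelope calculation. The per-stream rates are the rough estimates from the paragraph above, not provider specifications.

    # How many simultaneous streams would fit in an advertised 1 Gbps link,
    # using the rough per-stream estimates above and a 100% reserve.

    LINK_MBPS = 1000        # advertised gigabit connection
    HD_STREAM_MBPS = 5      # high end of the 4-5 Mbps HD estimate
    UHD_STREAM_MBPS = 25    # "4K UHD" estimate

    print(LINK_MBPS // (HD_STREAM_MBPS * 2))    # 100 HD streams, doubled for reserve
    print(LINK_MBPS // (UHD_STREAM_MBPS * 2))   # 20 4K streams, doubled for reserve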

“Your mileage may vary…”

Despite the sales pitch, a promotion for 1 Gbps might not deliver 1 Gbps. Just about every internet package is described as “up to xxx Mbps,” which means they’re not actually guaranteeing any particular minimum speed. Many factors, more or less beyond the ISP’s control, affect the actual internet speed delivered to an end device.

Alert: tech talk coming up. You may want to jump to “What else is on the line?”

Obviously, the ISP has no control over the connections or devices inside the house. Until recently, most computers shipped with wired network cards that wouldn’t communicate faster than 100 Mbps. While the computers’ connections got faster, upstream devices such as home routers may not have been upgraded; some older routers were limited to only 10 Mbps input from the internet, so internal computers could communicate much faster among themselves than with the internet. Faster WiFi technology has come later, and at much higher cost, than faster wired connections. Also, most WiFi equipment shares a single total capacity among all the devices connected to it, so any one device’s speed is limited by what the others may be doing, even if they’re not using the internet.

Before the signal even gets to the subscriber’s equipment, the ISP’s own modem may not support the maximum speed. Outside the house, some neighborhood transmission technologies have limiting factors of their own. On some systems, all subscribers share a single cable from a local hub. While that cable may be capable of more than the advertised speed, it may not deliver it if everyone is using it at the same time. Other systems give each subscriber a dedicated wire, but the potential speed falls off with distance from the hub. And at every hub between the user and the ISP’s connection to the internet, more users are vying for a finite amount of capacity.

Leaving the ISP does not mean clear sailing for maximum speed. The web page still has to pass through up to a couple dozen routers, any of which could be technologically limited, failing, or overloaded, slowing the connection. And at the ultimate server, the same failings on that end can slow down communications.

Another fly in the ointment is the complexity of the content itself. With the proliferation of rich web page advertising, some pages may pull content from scores of servers all over the world. Each piece has to be requested individually by the computer and run its own gauntlet of the factors mentioned above. Some pages downloaded over excellent connections can take up to 45 seconds to finish loading, even before the streaming service that was the goal of the connection starts.

What else is on the line?

If the carrier really delivers a consistent 1 Gbps to the doorstep and all the subscriber equipment is up to the speed, there’s more overhead that could nibble away at the best rate. Services such as telephone or security systems may be constantly consuming capacity. Computers and other devices (is the refrigerator talking to the supermarket yet?) may not be friendly about when they request their updates (corporate internet services have been slowed to a crawl on days that smart phones got an update – without anyone requesting or expecting it). Online data backup and synchronization services need to move large amounts of data and they don’t want to wait until overnight in case it’s needed before then.

But the real bandwidth hogs are the subscription services many assume are separate from the internet connection. Streaming media – audio and, especially, video delivered on demand – consume capacity immediately and continuously. And the higher the quality delivered, the more capacity is needed.

Television, even traditional television channels, is decreasingly being viewed via broadcast – one signal delivered to every viewer. Instead, the “tuner” is located at the provider’s offices and each subscriber receives a dedicated stream of the program; even if everyone on the block is watching the same football game. “Digital recording” works the same way with all the recordings and stop points stored in a database and generated from a central server on demand.

How much is enough?

Visualize three televisions or computers in the house in use at once. Add another stream for each simultaneous channel a “traditional,” on-premises DVR is recording. That’s 5 Mbps each. Audio streaming is about one-half Mbps per channel. Online gaming is indeterminate, but allow 1 Mbps because it demands immediate response. Add up to a couple more Mbps for incidental services, mail, and web surfing, as these demands are typically intermittent.

Total everything up and double it for reserve and future demands to get a conservative estimate of need. Now match need to affordability. Currently, depending on package and promotion, 50 Mbps may cost $35-$120. If that cost is too high, most streaming services will automatically adjust their quality to the available bandwidth, and most people probably won’t notice the first couple of steps back from ultimate quality. Also, ask the provider whether the extra-cost TV package consumes bandwidth you’re already billed for. Then ask them again and write down their name.
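
For anyone who wants to run their own numbers, here is a minimal sketch of that budget. The allowances are the rough figures from the paragraphs above, and the two-channel DVR is simply an assumption for illustration.

    # Back-of-the-envelope household bandwidth budget using the rough
    # allowances above. The two recording DVR channels are an example only.

    demands_mbps = {
        "three video screens at 5 Mbps": 3 * 5,
        "DVR recording two channels at 5 Mbps": 2 * 5,
        "audio streaming": 0.5,
        "online gaming": 1,
        "mail, web, and incidentals": 2,
    }

    subtotal = sum(demands_mbps.values())
    conservative_need = subtotal * 2   # double for reserve and future demands
    print(subtotal, "Mbps now;", conservative_need, "Mbps with reserve")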

If the cost for 50 Mbps is not excessive, then consider the upgrade. In some areas 1 Gbps is only $20 more than 50 Mbps. In most areas, 1 Gbps may be promoted but is actually pie-in-the-sky.

And still more gotchas.

While packages may be sold as 50 Mbps or 1 Gbps, those are download speeds. Most residential plans offer only 1 Mbps upload. Web surfing and media streaming have minimal and intermittent upload demands, so 1 Mbps is sufficient. Online backups, synchronization, and media sharing may take longer to complete, but they are rarely time-critical. However, more consumers are using two-way audio and video communications, which can quickly saturate that capacity, especially if they originate a conference call. Unfortunately, greater upload speeds are mostly available only with business-class packages, which are often much more expensive.

All of this discussion has been concerned only with the rate of the connection, with no mention of the total quantity of data moved. While most US wired ISPs have not (yet) started metering quantity, most cellular plans do. Cellular may offer up to 25 Mbps, but a continuous download at that rate will burn through a 2 GB (gigabyte) plan in about 10 minutes. A standard DVD movie (not Blu-ray) runs about 4 GB, and an hour of good-quality audio is around 30 MB.
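
The “about 10 minutes” figure is simple arithmetic, sketched below using the 25 Mbps rate and 2 GB cap from the paragraph above.

    # Time for a continuous 25 Mbps download to exhaust a 2 GB cellular plan.

    plan_gb = 2
    rate_mbps = 25

    plan_megabits = plan_gb * 1000 * 8        # 2 GB is roughly 16,000 megabits
    seconds = plan_megabits / rate_mbps       # 640 seconds
    print(round(seconds / 60, 1), "minutes")  # about 10.7 minutes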


References:
Mega - Giga, etc.:     https://en.wikipedia.org/wiki/Gigabit


Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes

Thursday, July 2, 2015

Is LastPass Hacked?


In the middle of June 2015, the password manager LastPass sent a message to their users announcing that their internal security had been breached and some tens of thousands of records from one of their databases had been stolen.

Yeah, that’s technical PR talk for “We been hacked!”

Does this mean LastPass is worthless? Should you stop using it? Should you change your password?

Answers: No, No, and Maybe.

If your LP master password was weak, you definitely should change it. And if you used that master password anywhere else, you need to change it on every other site where you used it.


A “weak” password is anything that looks like it might have come from a dictionary of any major language, including char@ct3r substitutions or random capitaliZation. A strong password should be at least 15-20 characters long, truly random, and include all four character types.

You can get a quick evaluation of how good your password might be at https://www.grc.com/haystack.htm. For randomness without any unconscious human prejudices, use a good password generator, such as the ones available at grc.com or the one built into LastPass.
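
If you would rather roll your own than trust a website, a minimal generator along these lines takes only a few lines of Python. This is strictly my own illustrative sketch using the operating system’s cryptographic random source; it is not the generator built into LastPass or the ones at grc.com.

    # Minimal random password generator sketch. Illustration only; not the
    # LastPass or grc.com generator.
    import random
    import string

    _rng = random.SystemRandom()   # OS-backed randomness suitable for passwords

    def generate_password(length=20):
        """Return a random password containing all four character types."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        while True:
            candidate = "".join(_rng.choice(alphabet) for _ in range(length))
            if (any(c.islower() for c in candidate)
                    and any(c.isupper() for c in candidate)
                    and any(c.isdigit() for c in candidate)
                    and any(c in string.punctuation for c in candidate)):
                return candidate

    print(generate_password())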

For more technical details on this topic, read on below.

What did LastPass lose?
Apparently records were stolen for a small number of their subscribers from a server containing user names, a hash of the user passwords, and the per-user salt used to create the hash.

A hash ensures that the bad guys can’t just log in somewhere with the information they stole; they have to recover your actual password from what they have. The fact that LastPass uses a per-user salt prevents them from hashing a dictionary once and comparing the results to their whole take. Instead, they have to individually brute-force (try every possible character combination) each user, because the same password for different users results in different hashes.
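
A small sketch may make the salt’s effect concrete. This is generic code, not LastPass’s actual storage scheme: two users with the identical password end up with completely different stored hashes, so work spent cracking one cannot be reused against the other.

    # Two users, identical password, different per-user salts: the stored
    # hashes do not match. Generic illustration, not LastPass's real scheme.
    import hashlib
    import os

    def stored_hash(password, salt):
        return hashlib.sha256(salt + password.encode()).hexdigest()

    salt_alice = os.urandom(16)
    salt_bob = os.urandom(16)

    print(stored_hash("correct horse battery staple", salt_alice))
    print(stored_hash("correct horse battery staple", salt_bob))   # different digest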

And now they have access to my account?
Now they can start attacking one person’s account, except that LastPass threw them another delaying tactic. Instead of hashing your password once, or even 500 times, they hash it 100,000 times before saving it. Anyone trying to test a guessed password against the stolen hash has to spend microseconds on each try rather than picoseconds. Even with specialized computers, they can only test a few thousand possible passwords per second.
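
The usual way to get that “hash it 100,000 times” behavior is an iterated key-derivation function such as PBKDF2. The sketch below is a generic illustration, not LastPass’s exact implementation; its point is simply that every single guess now costs 100,000 hash operations.

    # Iterated ("stretched") hashing: each guess costs 100,000 hash rounds
    # instead of one. Generic PBKDF2 illustration, not LastPass's exact code.
    import hashlib
    import os
    import time

    salt = os.urandom(16)
    password = b"some master password"

    start = time.time()
    stored = hashlib.pbkdf2_hmac("sha256", password, salt, 100000)
    elapsed = time.time() - start

    print("one guess took", round(elapsed * 1000, 2), "milliseconds")
    print(stored.hex())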

“Thousands of passwords per second! I’m toast!”

Not necessarily. A simple 6-character password like aaa&1B has 750 billion possible combinations. At 100,000 guesses per second, it could take over 40 days to come up with a match. And that match allows them to break into one account. They have no way of knowing whether the account BoyObama will give them nuclear codes or a teenager’s Twitter account.

Since you have one 12-character password out of half a septillion combinations, it could take seven times the age of the universe to crack.
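
The arithmetic behind both claims is straightforward. The sketch below assumes a 95-character keyboard alphabet and the 100,000 guesses per second mentioned above; real attack speeds vary with hardware and the hashing scheme, and an attacker finds a match after searching about half the keyspace on average.

    # Rough keyspace and time-to-crack arithmetic. Assumes 95 printable
    # characters per position and 100,000 guesses per second; these are the
    # article's round numbers, not measured attack speeds.

    GUESSES_PER_SECOND = 100000
    SECONDS_PER_DAY = 60 * 60 * 24
    SECONDS_PER_YEAR = SECONDS_PER_DAY * 365

    def keyspace(length, alphabet_size=95):
        return alphabet_size ** length

    print(keyspace(6))                                           # ~7.4e11 combinations
    print(keyspace(6) / GUESSES_PER_SECOND / SECONDS_PER_DAY)    # ~85 days for the full space
    print(keyspace(12) / GUESSES_PER_SECOND / SECONDS_PER_YEAR)  # ~1.7e11 years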


References:
How many combinations:     https://www.grc.com/haystack.htm
And the number is called:     https://en.wikipedia.org/wiki/Metric_prefix

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes

Wednesday, July 1, 2015

Data for the ages


When we published an obituary for a long-time member in a club newsletter, several people mentioned that he had written a regular column in the newsletter years before. We thought it would be nice to share a representative sample of his writing with the members who remembered it as well as those who never met him.

During much of that period I was editor of that auspicious publication. I knew I had drafts of most monthly issues. More than that, I knew exactly where the electronic files were, but I was afraid they wouldn’t be in a readable format. Amazingly, most of the dated folders contained at least three files: allFeb.doc, Feb99_1.p65, and 9902.pdf (http://1drv.ms/1B7kVBD).

I now had both a Microsoft Word (97-2003) file of the collected articles, as submitted, and a PDF of the finished newsletter. It’s been more than 15 years, but I was able to come up with a readable sample of his writing in just a few minutes.

How did this happen?

1) I could find it. Not only did I keep it in an orderly file structure, but I knew where those files were likely to be. Since home computers first came with hard drives, all my household’s data have been saved to a single logical area on a single physical disc. As new computers and technology came along, the data were migrated intact to the new drive in the same location.

I learned long ago that storage was cheaper than organization. When the PC finally drove my typesetting business into the ground in 1995, I had accumulated 2,000 to 3,000 floppy discs(1) on the shelf holding almost 15 years of my clients’ jobs, from resetting a headline to an entire catalog or complex form. Any file was accessible if I had a single identifying number, which was often built into the finished print.

2) It was physically available. With every new computer, I copied the files to it, so I know the disc spins and the bits are still readable. For many files, I still have my previous computer, although it has not been powered on for over five years now.(2) I also use Carbonite, a reliable online(3) commercial backup service.

3) I could read the file format. By virtue of its ubiquity and longevity, the Word .DOC format is still accessible by most modern word processors. While I wouldn’t count on Microsoft continuing to support it in another five years (it was superseded with Office 2007 and they are enforcing their standard 10-year end-of-life), there are a number of other programs that read it now. With even commercial software now being delivered by download, I’m also keeping the installation files for my software on that cheap storage. Hopefully I’ll be able to reinstall an old version if I need it, as long as the x86 instruction set survives.

If anything, the .PDF format is even more universal than .DOC with many programs, including most browsers, now incorporating a reader. And I can always do a new install of Adobe Reader 9 from my archives.

Non Sequitur:

(1)     Those 3,000 floppy discs represent barely 500 MB of data. That reflects the efficiency of storing data in a time before multi-terabyte hard drives. Some of the documents included design complexity to rival what a good secretary would do in a word processor or the word count of a small newspaper, but it was stored as simple codes that gave the printer instructions as to font, size, style, and location to put on the document. It also did not include any images. The color photos alone in an 8-page brochure today could easily add up to that 500 MB.

(2)     It may sound like a compulsive waste of space, but I once thought I might need to recalculate a tax return from many years previous. Although I had the original CD for the software, it would not install on my new computer. Fortunately, the old computer booted with the program and all its updates as of April 15 of the necessary year. Caveat: I was lucky that the computer booted. Even in mothballs, CDs, hard discs and electronics that have sat on a shelf in the garage or attic can deteriorate fatally. And don’t forget, CD drives are fast becoming dinosaurs.

(3)     Carbonite, and most other backup programs, are usually only for backup, not archival purposes. This means when you delete a file off the source disc, the backup service will delete it from their system as well. (Carbonite will keep files that are no longer on your computer for 30 days and then remove them from their system.)

If you have files that you want to preserve, but may not look at for years, you need to take specific precautions. Some possible options might be:
  • You can keep them on your active hard drive so they continue to be backed up.
  • You can move them to your own offline storage and test them at least annually for accessibility. If you do this, you should replicate them on two different types of media such as CDs and flash drives.
  • Or you could manually copy them to a cloud service that does not sync to a local file, including syncing the fact of deletion. At the moment Microsoft’s and Google’s online storage is free – up to a limit – and can be used without syncing. Remember, though, that even these companies have changed their focus and discontinued services; often with little warning.
  • For the extremely technically competent, some paid backup services can give you detailed control over retention rules. Amazon has such a service, which only super geeks are aware of, for pennies per gigabyte per month; but you might have to wait a day or two for them to retrieve your data.

The best solution is probably a combination of more than one of these options. And for the really valuable documents – drafts of your best-seller, masters to your gold record, Howard Hughes’ will naming you – include a classic analog copy: toner or pigment on archival-grade paper. Beware of inexpensive ink-jet printers: dye-based inks can fade, while the pigments used in better photo printers are much longer lived.

To preserve non-text content such as images or sounds for generations without having to revalidate them every couple years, the only option is metal. Photos (still or moving) should be saved as color-separated (not an amateur process) silver on a stable base. Classically this is referred to as “black and white film negatives.” The copper master disc for pressing an LP should be sufficient for audio recordings. This is basically the strategy NASA used when they shot the world’s “Hello”s to the stars.

Unlike my 1990-version PageMaker digital files, all of these analog media should be readily decodable with the basic software built into most advanced terran life. Extracting the audio may be a little more difficult, but even 20th century technology should be able to come up with a way to turn physical squiggles on a disc into the corresponding sound, even without a turntable.

More information: https://en.wikipedia.org/wiki/Media_preservation

Creative Commons License. This work by Bill Barnes is licensed under a Creative Commons BY-NC-SA 3.0 US License. Permissions beyond the scope of this license may be available at http://zaitech.com/satellite/contacts.htm.
(cc) 2014- Bill Barnes
