Last week, Questions to Carriers started to explain the constantly changing scope of mobile networks in the US. We broke down the difference between CDMA and GSM network infrastructures, a divide that has long been a hurdle for consumers switching devices or carriers. But now manufacturers are creating phones that function on both GSM and CDMA networks, meaning consumers can focus less on infrastructure compatibility and more on what we really care about: data speed.
So today Questions to Carriers will dive into data speeds and a different set of confusing acronyms: 3G, 4G, LTE and even the future 5G.
But before jumping into the more commonly known data speeds, let’s take a look back at 3G’s predecessors.
Before the entrance of 3G
The majority of mobile internet users today can probably barely recall the days prior to 3G technology. But of course there was a time when downloading an email attachment felt like it took an hour. Can you imagine trying to live tweet or stream a song without at least 3G speeds?! Ahhh!!
Before the wide adoption of smartphones (which basically gave consumers a reason to want faster mobile data speeds) there was the first generation (1G) of mobile communications. The 1G system was based on Bell Systems' "Improved Mobile Telephone Service" (IMTS); it was developed in the late 1960s and adopted in the early 1970s. In the following decade, 2G development began. 2G was the first digital cellular system, allowing mobile network operators to provide better quality service at a lower cost.
Before smartphones, the demand for fast data didn't really exist. We weren't uploading images to Instagram or tweeting on the go. And voice calls and SMS functioned well enough over 2G technology. The International Telecommunication Union (ITU) oversaw much of the development of 1G and 2G technologies. But standards for data transmission technologies were not set until the advent of the third generation (3G) network infrastructure.
When talk of a third generation of mobile communication technology began in the late 1990s, the ITU decided standards should be set for global adoption. Under the label "IMT-2000," the ITU set standards for 3G speed and capacity after brokering an agreement between the cellular industry and global leaders in 2000.
The emerging 3G technologies
The first standards, set under IMT-2000 (International Mobile Telecommunications-2000), established an expectation of progression with each new generation. That progression included faster data transmission speeds, more network capacity and better connection quality, in terms of both service and devices. But the standards did not dictate how mobile network operators (MNOs) had to reach them.
3G data transmission standards target rates of up to 2 Mbit/s (for stationary or low-mobility users), a big advancement over typical 2G speeds, which ranged from 9.6 kbit/s to 28.8 kbit/s.
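To make those numbers concrete, here is a rough back-of-the-envelope comparison of how long a 5 MB email attachment would take to download at each generation's rate. This is a sketch using theoretical peak rates; real-world throughput is always lower.

```python
# Back-of-the-envelope download times at theoretical peak link rates.
# Real-world throughput is lower than these figures.

def download_seconds(size_mb: float, rate_kbit_s: float) -> float:
    """Seconds to transfer a file of size_mb megabytes at rate_kbit_s."""
    return (size_mb * 8 * 1000) / rate_kbit_s  # MB -> kilobits, then divide by rate

rates_kbit_s = {
    "2G (9.6 kbit/s)": 9.6,
    "2G (28.8 kbit/s)": 28.8,
    "3G (2 Mbit/s)": 2000,
}

for label, rate in rates_kbit_s.items():
    print(f"5 MB attachment over {label}: {download_seconds(5, rate):,.0f} s")
```

At 2G's 28.8 kbit/s, that attachment takes around 23 minutes; at 3G's 2 Mbit/s, about 20 seconds. Hence "downloading an email attachment felt like it took an hour."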
In the early 2000s, as 3G technology entered the American mainstream, GSM and CDMA networks found different ways to provide the required data transmission rates. Those different techniques came with a bunch of new acronyms to remember: consumers may recall EV-DO, EDGE and HSPA.
These 3G advancements came in two forms (pdf):
Evolutionary: The process of an MNO modifying its existing 2G spectrum to meet 3G speed requirements. Through backward compatibility, MNOs that reused their existing spectrum carried 2G and 3G users on the same radio frequencies.
Revolutionary: MNOs licensed more spectrum with different radio frequency bands that they could use alongside their existing frequencies. These providers would thus create a dual mobile network with different frequency bands for 2G and 3G.
When discussion of 4G technology began, spectrum-issuing authorities such as the FCC gave MNOs both options. Specific new frequency bands were designated for 4G and potential new technologies. Large MNOs (those with enough capital) typically opted to purchase more spectrum to grow their networks, while most smaller operators were forced to evolve their existing networks to meet the standards.
The ITU set goals (under the label IMT-Advanced) of 100 Mbit/s transmission speeds (pdf) for a network to be considered 4G. This was another drastic improvement over previous generations (remember, 3G was 2 Mbit/s). That was great for consumers, but for MNOs it came at a cost: they had to either modify their current 3G network infrastructure or buy more spectrum once again.
To meet the goal of 100 Mbit/s, MNOs began implementing new network technologies called LTE and WiMax. When these technologies emerged they were the fastest data transmission systems to date.
But did they meet the 4G data speed standard?
As carriers attempted to upgrade their networks, they realized that 100 Mbit/s was harder to achieve than they originally thought. Without an enforcing authority, mobile network operators began marketing "4G" speeds whenever they felt significant speed increases had been made, even if they did not meet the 100 Mbit/s standard.
In 2010, the ITU acknowledged that the major speed increase set by 4G standards was aggressive. Ultimately, the ITU decided carriers could call their network “4G” if there was a “substantial level of improvement in performance and capabilities with respect to the initial third generation systems now deployed.”
4G speeds of 100 Mbit/s were not actually met by many carriers until 2013, with the entrance of LTE-Advanced.
The Real 4G Network (often labeled as 4G LTE)
What is widely considered "real 4G" does, in fact, meet the original standards for 4G and comes from advancements in LTE and WiMax technologies. Dubbed LTE-Advanced and WiMax 2.0 (both considered 4G by the ITU under IMT-Advanced standards), these two technologies competed to become the primary technology MNOs would use to achieve 4G capabilities.
In late 2012, WiMax accepted "defeat" in terms of MNO adoption and began laying the groundwork to integrate with LTE-Advanced. Today, all US carriers have implemented some version of LTE technology.
Now that MNOs have finally met the requirements of 4G with LTE-Advanced, consumers should actually experience 100 Mbit/s transmission speeds. Yay! But since we are finally there, it seems it is time to talk about the future once again... meaning 5G.
The ITU recently established a roadmap, named IMT-2020, to reach 5G technology by 2020. The first US mobile provider claiming it will get there is Verizon, with recent reports of "some level of commercial deployment by 2017."
So what speeds will consumers hit on a 5G network? According to reports the goal is 20 Gbit/s, although the ITU has not set an official public standard.
A 20 Gbit/s connection would be the fastest Internet ever seen, faster than most home broadband networks. Google Fiber, known as the fastest Internet in the US, currently runs at 1 Gbit/s, meaning 5G would be 20x faster.
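The arithmetic behind those comparisons is simple enough to check directly. A sketch, keeping in mind that the 20 Gbit/s figure is a reported goal rather than an official ITU standard:

```python
# Reported and target peak rates, normalized to Mbit/s for comparison.
FIVE_G_TARGET_MBIT = 20_000  # 20 Gbit/s -- reported goal, not an official ITU standard
FOUR_G_TARGET_MBIT = 100     # the IMT-Advanced 4G goal
GOOGLE_FIBER_MBIT = 1_000    # Google Fiber's 1 Gbit/s

print(f"5G vs Google Fiber: {FIVE_G_TARGET_MBIT / GOOGLE_FIBER_MBIT:.0f}x faster")
print(f"5G vs the 4G goal:  {FIVE_G_TARGET_MBIT / FOUR_G_TARGET_MBIT:.0f}x faster")
```

So the reported 5G goal is 20x Google Fiber's current speed, and a full 200x the original 4G target.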
The claim comes with more questions for end consumers though. LTE-Advanced has opened an opportunity for MNOs to actually achieve the data transmission speeds consumers demand. But does the implementation of LTE make CDMA and GSM irrelevant?
The answer is yes and no. The implementation of LTE means compatible devices come with a SIM card (LTE identifies devices via a SIM rather than a MEID), even if the device is tied to a CDMA network. So while your device may typically transmit data via LTE, if you hit an LTE "dead zone" your device will fall back to its traditional network infrastructure, meaning CDMA or GSM.
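That fallback behavior can be sketched as a toy decision. The function name and logic here are purely illustrative; real baseband firmware is far more involved than this:

```python
# Toy model of dual-mode network selection: prefer LTE for data,
# fall back to the device's legacy network (CDMA or GSM) otherwise.
# An illustration only, not how actual baseband logic works.

def select_network(lte_available: bool, legacy_network: str) -> str:
    """Return which radio technology a dual-mode device would use."""
    return "LTE" if lte_available else legacy_network

print(select_network(True, "CDMA"))   # in LTE coverage: LTE
print(select_network(False, "CDMA"))  # LTE dead zone: fall back to CDMA
```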
The entrance of VoLTE (or voice calling over LTE data transmission) is helping make CDMA and GSM less relevant. But until carriers have universal LTE coverage, you will still need older network infrastructure fallbacks. A future where all mobile phone functions (voice, messaging and internet) transmit through LTE is exciting, mostly because that means devices should work on any network.
Sadly, that still may not be the case.
So, next week Questions to Carriers will break down how carriers can still impact phone functionality through LTE bands. Widely adopted VoLTE may be on the horizon, but will falling back to traditional voice networks still trap your phone on a specific carrier's infrastructure?