Trends in Information Technology

1. quantum computing

1.1. Quantum computing is an area of computing focused on developing computer technology based on the principles of quantum theory, which explains the behavior of energy and matter at the atomic and subatomic levels.
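
1.2. A minimal sketch of the idea in Python (numpy assumed available): a single qubit is a two-amplitude state vector, and a Hadamard gate puts it into an equal superposition of 0 and 1.

```python
import numpy as np

# A qubit's state is a 2-element complex vector; |0> = [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts a basis state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0  # now (|0> + |1>) / sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- equal chance of measuring 0 or 1
```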

2. virtual reality and augmented reality

2.1. Virtual reality is a computer-generated simulation of an alternate world or reality, and is primarily used in 3D movies and in video games. Virtual reality creates simulations, meant to shut out the real world and envelop or “immerse” the viewer, using computers and sensory equipment such as headsets and gloves. Apart from games and entertainment, virtual reality has also long been used in training, education, and science.

2.2. AR is used in apps for smartphones and tablets. AR apps use your phone's camera to show you a view of the real world in front of you, then put a layer of information, including text and/or images, on top of that view.
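
2.3. A toy sketch of that overlay step in Python, assuming the opencv-python package is installed; the camera frame is faked with a blank numpy image, and the label text and coordinates are made up for illustration.

```python
import numpy as np
import cv2  # assumes the opencv-python package is installed

# Stand-in for one camera frame (a real AR app would read this from
# the device camera, e.g. via cv2.VideoCapture(0).read()).
frame = np.zeros((240, 320, 3), dtype=np.uint8)

# Overlay a layer of information -- here, a text label -- on the view.
cv2.putText(frame, "Cafe: 50 m ahead", (10, 120),
            cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 2)

cv2.imwrite("overlay.png", frame)  # the "augmented" frame
```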

3. blockchain

3.1. A blockchain is a distributed, append-only ledger in which records are grouped into blocks, and each block is cryptographically linked to the previous one by its hash. Because changing any earlier block would break every later link, recorded data is tamper-evident without a central authority. Blockchains underpin cryptocurrencies such as Bitcoin and are also used for applications like supply-chain tracking and smart contracts.
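
3.2. A minimal sketch in Python of the hash-linking idea, using only the standard library; the block fields and transaction strings are illustrative, not any real protocol's format.

```python
import hashlib, json, time

def make_block(data, prev_hash):
    """Create a block whose identity depends on the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# A tiny chain: each block commits to the one before it.
genesis = make_block("genesis", "0" * 64)
block1 = make_block("Alice pays Bob 5", genesis["hash"])

# Tampering with an earlier block breaks every later link.
genesis["data"] = "genesis (tampered)"
recomputed = hashlib.sha256(json.dumps(
    {k: genesis[k] for k in ("timestamp", "data", "prev_hash")},
    sort_keys=True).encode()).hexdigest()
print(recomputed == block1["prev_hash"])  # False -- tampering is detectable
```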

4. 5G

4.1. 5G is the 5th generation mobile network.

4.2. It is a new global wireless standard after 1G, 2G, 3G, and 4G networks.

4.3. 5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra low latency, more reliability, massive network capacity, increased availability, and a more uniform user experience to more users.
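
4.4. A back-of-the-envelope sketch in Python of what peak speed means for download time; the 4G and 5G rates below are assumed round numbers, and real-world throughput is far below peak.

```python
# Illustrative only: peak rates vary widely by network and conditions.
file_size_gb = 2.0                      # a 2 GB video file
peak_gbps = {"4G": 0.3, "5G": 10.0}     # assumed peak data rates

for gen, gbps in peak_gbps.items():
    seconds = file_size_gb * 8 / gbps   # GB -> gigabits, then divide by rate
    print(f"{gen}: {seconds:.1f} s at {gbps} Gbps peak")
```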

5. artificial intelligence (AI) & machine learning

5.1. Artificial intelligence is the capability of a computer system to mimic human cognitive functions such as learning and problem-solving. Through AI, a computer system uses maths and logic to simulate the reasoning that people use to learn from new information and make decisions.

5.2. Machine learning is an application of AI. It’s the process of using mathematical models of data to help a computer learn without direct instruction. This enables a computer system to continue learning and improving on its own, based on experience.
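
5.3. A minimal sketch in Python (numpy assumed) of learning from data without direct instruction: gradient descent fits the slope of a line from noisy examples rather than being told it.

```python
import numpy as np

# Synthetic data: y = 3x + noise. The model is never told the "3";
# it learns the slope from examples instead of explicit instructions.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3 * x + rng.normal(0, 1, 100)

w = 0.0  # initial guess for the slope
for _ in range(200):
    grad = 2 * np.mean((w * x - y) * x)  # gradient of mean squared error
    w -= 0.001 * grad                    # step downhill

print(w)  # close to 3 -- learned from data, not programmed
```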

6. robotic process automation (RPA)

6.1. RPA is a software technology that makes it easy to build, deploy, and manage software robots that emulate human actions when interacting with digital systems and software. Just like people, software robots can do things like understand what’s on a screen, complete the right keystrokes, navigate systems, identify and extract data, and perform a wide range of defined actions. But software robots can do it faster and more consistently than people, without the need to get up and stretch or take a coffee break.
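
6.2. A toy sketch in Python of that extract-and-enter loop; the invoice text, field names, and the enter_into_system helper are hypothetical stand-ins for a real RPA tool's screen capture and keystroke replay.

```python
import re

# A screen's text as a bot might capture it (hypothetical invoice).
screen_text = "Invoice #10234  Vendor: Acme Corp  Total: $1,250.00"

# "Understand what's on the screen": extract the fields it needs.
fields = re.search(
    r"Invoice #(?P<number>\d+)\s+Vendor: (?P<vendor>.+?)\s+Total: \$(?P<total>[\d,.]+)",
    screen_text).groupdict()

def enter_into_system(field, value):
    # Stand-in for the keystrokes/clicks a real RPA tool would replay.
    print(f"typing {value!r} into {field}")

# "Complete the right keystrokes": replay the defined actions, every time.
for field, value in fields.items():
    enter_into_system(field, value)
```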

7. internet of things (IoT)

7.1. The internet of things is a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
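
7.2. A minimal sketch in Python of a "thing": a device with a unique identifier that packages a sensor reading for transfer over a network; the transport itself (e.g. MQTT or HTTP) is left out, and the temperature value is a stand-in.

```python
import json, time, uuid

# Each device gets a unique identifier (UID) so the network can tell
# one "thing" from another without human involvement.
DEVICE_ID = str(uuid.uuid4())

def read_temperature():
    return 21.5  # stand-in for a real sensor driver

# One reading, serialized as a device might publish it over the network.
message = json.dumps({
    "device_id": DEVICE_ID,
    "timestamp": time.time(),
    "temperature_c": read_temperature(),
})
print(message)
```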

8. cyber security

8.1. Cyber security is the practice of defending computers, servers, mobile devices, electronic systems, networks, and data from malicious attacks. It's also known as information technology security or electronic information security. The term applies in a variety of contexts, from business to mobile computing.
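
8.2. One concrete defensive practice as a minimal Python sketch: salted, key-stretched password hashing with the standard library's hashlib; the password and iteration count below are illustrative.

```python
import hashlib, hmac, os

def hash_password(password: str, salt: bytes) -> bytes:
    # Key-stretching makes brute-force guessing far more expensive.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)                        # random per-user salt
stored = hash_password("correct horse", salt)

# On login, re-derive the hash and compare in constant time.
attempt = hash_password("correct horse", salt)
print(hmac.compare_digest(stored, attempt))  # True
```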