Tech Talk: What the !#%^ does that mean?


Not knowing tech talk is so Y2K.

Digital literacy isn’t just reserved for people in the tech industry anymore. If your tech talk abilities are sub-par, chances are you’re stuck nodding along during conversations at work or even in your social circle.

Tech Talk: Why Learn It?

While “technology is changing our lifestyle” is a common phrase, it is more accurate to say that technology is our lifestyle. All of our personal records are online, we use technology for work and education, and we communicate every day via social platforms.

Digital literacy is beneficial for everyone: it helps us understand the technology that we use every day and how to use it appropriately, it helps us understand our privacy and security, and it helps us understand the technology of the future!

So, here’s the 411 on all the tech talk buzzwords and trends you will continue to see in the news.

Data

Data is a broad term that we hear every day. If you haven’t noticed, data is the backbone of all tech-related talk in the news. In the simplest of terms, data is the information and figures that are stored in a computer. Businesses can collect, store, and analyze this data to improve performance and to drive business. Here are some additional terms that data encompasses:

Graph Databases

A graph database is a non-relational database suited to very large, complex sets of distributed data. Visually, a graph database is like a web of data; by contrast, a relational database resembles a table of data. This interconnectivity is useful for analyzing relationships and discovering patterns. Graph databases are often used in data mining, for example to uncover the connections between users’ actions on social media, or to map relationships in supply chain management.
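To make the “web of data” idea concrete, here is a minimal Python sketch (a toy, not tied to any particular graph database product) that stores relationships as edges and walks them to find connections:

    # A tiny in-memory "graph": each key is a node, each value is the set
    # of nodes it is directly connected to (its edges).
    graph = {
        "Alice": {"Bob", "ShoeStore"},
        "Bob": {"Alice", "ShoeStore"},
        "Carol": {"Bob"},
        "ShoeStore": set(),
    }

    def connections(start, depth=2):
        """Collect every node reachable from `start` within `depth` hops."""
        found, frontier = set(), {start}
        for _ in range(depth):
            frontier = {n for node in frontier for n in graph.get(node, set())}
            found |= frontier
        return found - {start}

    print(connections("Carol"))  # Carol -> Bob -> Alice, ShoeStore

A relational database would store the same facts as rows in a table; the graph form makes “who is connected to whom, through whom” a one-line traversal.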

Cassandra

Cassandra is an open-source, distributed database management system designed to handle large amounts of data across multiple servers. Because Cassandra replicates data across a cluster of machines, the data stays connected and available rather than being isolated in exclusive data silos. At CTO Boost, we’ve helped dozens of companies enjoy the benefits of Cassandra without having to spend their own time and resources managing their data infrastructure.
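As a hedged sketch of what working with Cassandra looks like, here is a minimal example using the open-source DataStax Python driver (the cassandra-driver package); the keyspace and table names are invented for illustration:

    from cassandra.cluster import Cluster  # pip install cassandra-driver

    # Connect to one node; the driver discovers the rest of the cluster.
    cluster = Cluster(["127.0.0.1"])
    session = cluster.connect()

    # Hypothetical keyspace and table, purely for illustration.
    session.execute("""
        CREATE KEYSPACE IF NOT EXISTS demo
        WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
    """)
    session.execute("""
        CREATE TABLE IF NOT EXISTS demo.users (id int PRIMARY KEY, name text)
    """)
    session.execute("INSERT INTO demo.users (id, name) VALUES (%s, %s)", (1, "Alice"))

    for row in session.execute("SELECT id, name FROM demo.users"):
        print(row.id, row.name)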

Cloud Computing

Cloud computing refers to data and applications that are stored not on your computer, but on servers accessible via the internet. Gmail and Google Docs are examples: the data you view on your device is stored externally, in the “cloud.”

Data Mining

Data mining is the process of examining massive amounts of data to find consumer patterns and behaviors that are useful for digitally marketing goods and services.
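As a toy illustration of pattern-finding (the shopping data below is invented, and real data mining uses far larger datasets and more sophisticated techniques), even counting which items are bought together reveals patterns:

    from collections import Counter
    from itertools import combinations

    # Hypothetical purchase data: each list is one customer's basket.
    baskets = [
        ["bread", "milk", "eggs"],
        ["bread", "milk"],
        ["milk", "eggs"],
        ["bread", "milk", "butter"],
    ]

    # Count how often each pair of items appears in the same basket.
    pairs = Counter()
    for basket in baskets:
        pairs.update(combinations(sorted(basket), 2))

    print(pairs.most_common(3))  # ('bread', 'milk') turns up most often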

Big Data

Big data can be defined as collections of data – from any source – that are so large that they cannot be processed through traditional data processing systems. By finding ways to sort this data, companies can better identify consumer patterns and use them to optimize their business.
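The core idea, data too big to load all at once, can be shown with a small Python sketch that streams a (hypothetically huge) file one line at a time instead of reading it entirely into memory; the file name and format are assumptions for illustration:

    from collections import Counter

    # Stream the file line by line rather than loading it whole.
    page_views = Counter()
    with open("clickstream.log") as log:          # hypothetical file
        for line in log:
            user, page = line.strip().split(",")  # assumed "user,page" format
            page_views[page] += 1

    print(page_views.most_common(5))  # top pages across the whole dataset

Real big-data systems apply the same principle, splitting the work across many machines instead of one loop.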

Dark Data

Dark data refers to the unstructured data in an organization’s database that has never been processed or analyzed, such as financial statements and customer information. According to the International Data Corporation, 90% of organizations’ information assets are stored but never analyzed. By analyzing the massive amounts of dark data sitting in their databases, companies can uncover valuable insights to drive business.

Tech Trends

While tech has made its way into mainstream media, it is seldom explained. We hear about VR, blockchain, and AI with little context or understanding of what they actually are. So, without diving into extensive computer science and IT, read up on these tech trends to help you understand tech in the news.

AI (Artificial Intelligence)

Artificial Intelligence technologies are computer systems built to mimic human intelligence and perform tasks faster and more accurately than humans, examples being Siri and Google Maps.

Cognitive technologies (products of AI) are becoming increasingly self-learning and intuitive. AI works by processing large amounts of data very quickly. Through algorithms, the software automatically learns patterns and features in the data that can be used to solve problems, generate insights, and much more.

Machine Learning

As a subcategory of AI, machine learning is a method of data analysis that discovers patterns and gains insights from data with minimal human intervention. ML applications are used for data analysis, data mining, and pattern recognition. The machine learning market is expected to grow to $8.81 billion by 2020.
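To get a feel for what “learning patterns from data” means in practice, here is a minimal sketch using the popular scikit-learn library (assuming it is installed; the customer data is invented for illustration):

    from sklearn.tree import DecisionTreeClassifier  # pip install scikit-learn

    # Invented example: predict whether a customer buys (1) or not (0)
    # from [age, pages_viewed]. Real projects use far more data.
    X = [[25, 3], [40, 12], [35, 8], [22, 1], [50, 15], [30, 2]]
    y = [0, 1, 1, 0, 1, 0]

    model = DecisionTreeClassifier()
    model.fit(X, y)                   # the "learning" step: find patterns in X -> y

    print(model.predict([[45, 10]]))  # predict for a new, unseen customer

No one wrote a rule saying “people who view many pages tend to buy”; the model inferred it from the examples, which is the essence of machine learning.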

RPA (Robotic Process Automation)

Robotic Process Automation is technology that uses software to automate repetitive tasks once done by humans. RPA completes a task by using software to interpret applications, process transactions, and send a response. Experts estimate that 5% of jobs on earth can be fully automated, while 60% can be at least partially automated.
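A heavily simplified sketch of that “interpret, process, respond” loop might look like the Python below; the file names, columns, and approval rule are all invented for illustration:

    import csv

    # Hypothetical RPA-style loop: read pending transactions, apply a rule,
    # and write out a response file, with no human in the loop.
    with open("pending.csv") as src, open("responses.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)          # assumes columns: id, amount
        writer = csv.writer(dst)
        writer.writerow(["id", "decision"])
        for row in reader:
            decision = "approve" if float(row["amount"]) < 1000 else "review"
            writer.writerow([row["id"], decision])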

Blockchain  

Blockchain is a decentralized, public database shared across many networks, commonly recognized as the record-keeping technology behind cryptocurrencies like Bitcoin. Once a transaction is made (a money transfer, a purchase, etc.), a record of it is created. That record is added to a block (a bundle of transactions), and each block is linked to the previous one, creating a chain of blocks.

Once a record is added, it is very difficult to change or delete, because each block contains its own data plus a fingerprint (hash) of the previous block. To change a single block on a blockchain, you would also have to change every block that comes after it. This makes blockchain extremely tamper-resistant and helps ensure that transactions are authentic.
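Here is a minimal Python sketch of that chaining idea (a toy, not real cryptocurrency code): each block stores the hash of the previous block, so editing one block breaks every link after it.

    import hashlib

    def block_hash(data, prev_hash):
        """Fingerprint a block from its own data plus the previous block's hash."""
        return hashlib.sha256((data + prev_hash).encode()).hexdigest()

    # Build a tiny chain of three transactions.
    chain = []
    prev = "0" * 64  # genesis block has no predecessor
    for tx in ["Alice pays Bob $5", "Bob pays Carol $2", "Carol pays Dan $1"]:
        h = block_hash(tx, prev)
        chain.append({"data": tx, "prev": prev, "hash": h})
        prev = h

    # Tamper with the first block: its recomputed hash no longer matches
    # what the next block recorded, so the change is immediately detectable.
    chain[0]["data"] = "Alice pays Bob $500"
    print(block_hash(chain[0]["data"], chain[0]["prev"]) == chain[1]["prev"])  # False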

Edge Computing

As the amount of data that companies handle increases, there is a need for processing that exceeds what cloud computing alone can offer. Edge computing moves data processing to “the edge” of the network, close to where the data is generated, rather than sending everything to the cloud. It addresses the limitations of cloud computing because it can process time-sensitive data even with a limited connection to the centralized data location. As daily life and business become increasingly digitized, the use of edge computing will grow.
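As a simplified sketch of the pattern (invented numbers, no specific platform): an edge device handles time-sensitive readings locally and ships only a compact summary upstream.

    # Hypothetical edge-computing pattern: act on urgent data locally,
    # and send only a small summary to the cloud.
    readings = [71.2, 71.4, 98.9, 71.1, 71.3]   # invented sensor data (degrees)

    for r in readings:
        if r > 90:                              # time-sensitive: act immediately,
            print(f"ALERT: overheating ({r})")  # no round trip to the cloud

    summary = {"count": len(readings),
               "avg": sum(readings) / len(readings),
               "max": max(readings)}
    send_to_cloud = summary                     # only this summary leaves the device
    print(send_to_cloud)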

Virtual Reality

Virtual reality is the use of computer technology to simulate an environment. It is a user interface that places the user inside the experience by simulating 3D vision, hearing, touch, and sometimes even smell. Augmented reality is similar, but instead of immersing the user in a simulation, AR overlays computer-generated objects on the user’s real environment.

VR is made possible through hardware headsets covering the user’s eyes and ears. The screens inside the headset sit directly in front of the user’s eyes, and the graphics react to the user’s head movements. A popular use of the VR headset is video games, as users feel like they are inside the game.

VR and AR have endless possibilities and have been used for military training, education, theme parks/movie theatres, and even to help surgeons train for surgery.

Cybersecurity

Cybersecurity is the practice of defending computers, servers, devices, networks, and data from malicious hackers. Cybersecurity is constantly evolving to protect personal and corporate data from hackers trying to access it illegally. As IoT grows and more and more information is digitized, there is an ever-growing need to secure this data.
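One everyday building block of data protection is storing passwords as salted hashes instead of plain text. Here is a minimal sketch using only Python’s standard library:

    import hashlib, os

    def hash_password(password):
        """Store a salted hash instead of the password itself."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify(password, salt, digest):
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

    salt, digest = hash_password("correct horse battery staple")
    print(verify("correct horse battery staple", salt, digest))  # True
    print(verify("guess123", salt, digest))                      # False

If hackers steal the database, they get salted fingerprints rather than the passwords themselves, which is far harder to exploit.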

Internet of Things

The Internet of Things (IoT) is a broad umbrella term for physical things being connected to each other and exchanging data via the internet. One example is Fitbit: while you wear it on your wrist, the watch uploads your health data to an app on your phone. Another example is anything related to “smart homes,” such as apps on your phone that let you turn appliances in your house on or off, even when you are not home.
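A hedged sketch of that device-to-cloud pattern (the URL, device name, and fields are invented; requests is a popular third-party HTTP library):

    import requests  # pip install requests

    # Hypothetical IoT device loop: take a reading and push it to a cloud API.
    reading = {"device_id": "fitness-tracker-42",  # invented identifiers
               "heart_rate": 72,
               "steps": 8450}

    resp = requests.post("https://api.example.com/v1/readings",  # hypothetical URL
                         json=reading, timeout=5)
    print(resp.status_code)  # e.g. 200 if the service accepted the data

Multiply that little loop across billions of watches, thermostats, and doorbells, and you have the Internet of Things.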