A short Digital Glossary by Apptitude.


Axel Pasqualini · Partner – Director · April 2024

Every day in our business, we hear and use many words that are more or less closely related to market demands and to trends in the development of digital products.

We also see these same words become raw material for content of all kinds, on social networks as well as in more traditional media. It is fascinating for us to watch the conversation about innovation gradually open up to everyone, but are these topical words really being used wisely?

Here is a short digital glossary of “buzzwords” (in the positive sense of the term) by our team, and how we perceive them at Apptitude:

 

Machine Learning

Machine learning is the kind of buzzword everyone is talking about. In simple terms, machine learning is a way of giving computers the ability to learn without being explicitly programmed by a developer to perform a specific task. Broadly speaking, two types of processes make this possible:

Supervised machine learning
The algorithms learn to map a given input to a known output (for example, footage of a person driving a vehicle can serve as labelled data to teach a computer to drive).

Unsupervised machine learning
The algorithms learn on their own, without labelled outputs (for example, building customer segments from purchase history).

The applications of machine learning are vast, and new ones are found every day. They range from simple use cases, like predicting the price of a used car, to more complex ones, such as identifying fraudulent online transactions.
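
To make the distinction more concrete, here is a minimal sketch (illustrative numbers only, assuming the scikit-learn library is available) of what the two flavours can look like in Python:

    # Supervised: learn to map car features (age, mileage) to a known price.
    from sklearn.linear_model import LinearRegression
    from sklearn.cluster import KMeans

    X_cars = [[2, 30_000], [5, 80_000], [8, 120_000]]    # [age in years, mileage in km]
    y_prices = [18_000, 11_000, 6_500]                   # known sale prices (the "labels")
    price_model = LinearRegression().fit(X_cars, y_prices)
    print(price_model.predict([[4, 60_000]]))            # estimate for an unseen car

    # Unsupervised: no labels at all; the algorithm groups similar customers by itself.
    purchases = [[1, 0, 0], [0, 5, 1], [0, 4, 2], [2, 0, 0]]   # purchases per product category
    segments = KMeans(n_clusters=2, random_state=0).fit_predict(purchases)
    print(segments)                                            # e.g. [0 1 1 0]

In the first case we provide the “right answers” (the prices) during training; in the second, the structure emerges from the data alone.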

Sadly, the word itself is currently being used for anything and everything. For example, I sometimes hear on the radio that the “birthdays of the day” lists on Facebook are generated with machine learning. That is not the case… Also, despite heavy advertising around machine learning, some online tools that claim to help you optimize your website really only rely on methods such as A/B testing, which are statistical testing methods rather than proper machine learning.
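
To illustrate the difference, here is what such an “optimization” tool often does under the hood (made-up traffic numbers, Python standard library only): a plain two-proportion z-test comparing the conversion rates of two page variants, with nothing being learned at all.

    # Classic A/B test: is variant B's conversion rate significantly better than A's?
    from math import erf, sqrt

    visitors_a, conversions_a = 1_000, 120
    visitors_b, conversions_b = 1_000, 150

    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))

    z = (rate_b - rate_a) / std_err
    p_value = 0.5 * (1 - erf(z / sqrt(2)))      # one-sided p-value
    print(f"z = {z:.2f}, p = {p_value:.3f}")    # pure statistics, no model "learns" anything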

Mauro Santos
Developer – Full Stack

 

AI – Artificial Intelligence

The term “Artificial Intelligence” has been thrown around a lot to describe all sorts of processes and algorithms, and it is indeed hard to draw the line between what *is* AI and what *isn’t*. To better tell the two apart, we could start by looking at the words separately: it would make sense for an “Artificial Intelligence” to be both artificial and an intelligence. However, “AI” has been a term in its own right for quite some time now, and its definition has fluctuated as the technologies around its core concepts gradually evolved.

Even though its popularity has only risen recently, the term was already born in the fifties. One of the pioneers of AI tells us what it was originally about, back in 1955:

“The exciting new effort to make computers think… machines with minds, in the full and literal sense.”

Source: chartsofhumanity.com

Nowadays, dictionaries define it more precisely, but still gravitate around the same concepts:

“[…] the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”

Source: Encyclopædia Britannica 

Given that these two definitions were written far apart in time, I think it is safe to say that AI is about machines doing work that usually requires human intelligence, rather than about any particular mathematical model. The definition is still broad, but at least we now know that it relates to human capabilities.

Telling whether a face is happy or sad is something humans are good at, so if a machine does it, that probably qualifies as “Artificial Intelligence”. Sorting a list of numbers, however, probably does not.
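
A rough way to feel that difference in code: the first task below is fully specified by ordinary logic, while the second has no explicit rule and would rely on a model trained on labelled examples (the classifier named in the comment is purely hypothetical).

    # Sorting a list of numbers: a precise rule exists, no "intelligence" required.
    print(sorted([42, 7, 19, 3]))   # [3, 7, 19, 42]

    # Telling a happy face from a sad one: nobody can write the rule down explicitly,
    # so a machine has to learn it from labelled example images. With a hypothetical
    # pre-trained model, the call might look like:
    #   label = emotion_classifier.predict(load_image("face.jpg"))  # -> "happy" or "sad"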

Kewin Dousse
Developer – Front-End

 

Digitalization

There is some debate here and there about the use of the word “digitalization”, particularly in French-speaking regions where “numérisation” is considered the better term. For French purists, friends of the Larousse and other academics, “digitalization” is an unacceptable Anglicism: you MUST say “numérisation”! For me, it is not that simple…

Originally, the word “digit” comes from the Latin “digitus (-tum)” and expresses a clear link with the fingers: the fingering of musicians, the fingerprints in NCIS, and many other examples I will let you imagine…

In English, “digital” also refers to the finger, but “digits” also means numbers. We are all familiar with that 10-character alphabet (0 to 9) that lets us form every other number… Maybe that is why we first learn to count on our fingers! Isn’t it odd that we are told about 0 long after the other 9 digits?

If we move forward in time (well past the days when Latin was spoken), we come to the famous Industrial Revolution, in the second half of the 18th century, which gave birth to the automation of hard human labor with the first engine: the steam engine. A century later, the Second Industrial Revolution would open up new paths with the arrival of electricity and oil.

The 20th century then marks the arrival of Turing, electronics, computers, and so on. Over that century we automated and computerized all kinds of tasks, processes, and documents. Then, finally, came the Internet and its democratization, which I would place around the year 2000 to keep things simple.

It is from this point, in my humble opinion, that we can distinguish computerization (“numérisation” in French) from digitalization, beyond the etymology alone.

In the 20th century, we were computerizing…
In the 21st century, we are digitalizing!

With the arrival of the Internet in society at large (no longer reserved for a few geeks), the hyper-penetration of the smartphone (more than one per person), the phenomenon of social networks, and Wi-Fi everywhere, what we are witnessing is a profound change in our social and economic model: one that knows no borders, no generations, no gender. In my opinion, digitalization in this sense deserves to be legitimized as a word in its own right in Le Larousse.

Axel Pasqualini
Partner – Director

 

Blockchain ⛓⛏

Make no mistake about it: the blockchain is a very powerful invention. However, some limitations in the way it operates currently restrict its usefulness to a few very specific cases, and, contrary to popular belief, it would be totally counterproductive to try to apply it to everything in an ordinary business context, primarily because of the following factors:

Content
The value of a blockchain lies in the impossibility of modifying its content once it has been written, while still allowing different, independent actors to add new content.

If only one entity were interested in adding content to a blockchain, it would be much simpler and far less energy-consuming to simply publish this data and sign it digitally with that entity’s cryptographic key.
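
For comparison, here is a minimal sketch of that simpler alternative (assuming the widely used Python cryptography package): the single publishing entity signs its data with a private key, and anyone holding the matching public key can check that the record has not been altered, with no mining involved.

    # Single-publisher case: a plain digital signature is enough, no blockchain needed.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()   # kept secret by the publishing entity
    public_key = private_key.public_key()        # given to anyone who needs to verify

    record = b"2024-04-02: delivered 40 pallets to warehouse B"   # illustrative data
    signature = private_key.sign(record)

    try:
        public_key.verify(signature, record)     # raises if the record was tampered with
        print("record is authentic")
    except InvalidSignature:
        print("record was modified")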

Cost and attractiveness
The impossibility of modifying the content of a blockchain is guaranteed by the computing power invested in “mining” it (i.e. solving a difficult mathematical problem with a computer).

It is therefore essential that enough players are interested in mining on a particular blockchain so that the cost of the computing power required to modify the content of the chain is higher than the potential gain obtained by said modification.
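
To give an idea of what “solving a difficult mathematical problem” means in practice, here is a deliberately tiny proof-of-work sketch (Python standard library only, with an artificially low difficulty): a miner searches, by brute force, for a nonce that makes the block’s hash start with a given number of zeros.

    # Toy proof-of-work: find a nonce so that the block hash starts with N zeros.
    import hashlib

    def mine(block_content, difficulty=4):
        prefix = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_content}{nonce}".encode()).hexdigest()
            if digest.startswith(prefix):
                return nonce, digest     # proof that computing power was spent on this content
            nonce += 1

    nonce, digest = mine("Alice pays Bob 2 coins")
    print(nonce, digest)

Each extra zero of difficulty multiplies the average work by 16, which is what makes rewriting an already-mined block prohibitively expensive.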

Verification
The content of a blockchain is public. It is possible to encrypt your data before publishing it in the blockchain, but it will probably be difficult to find people willing to mine a chain whose content they can’t verify.

This is why the blockchain is particularly well suited to the implementation of crypto-currencies, where each transaction must be public and irreversible. It also makes “smart contracts” possible: programs published on the blockchain that are executed when a certain condition is met (for example, when someone has paid a given sum in crypto-currency to a given address). Since the program cannot be modified and is executed by all the participants mining the chain, the buyer is guaranteed that the programmed action will be carried out correctly.
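
Real smart contracts are written in dedicated languages (Solidity on Ethereum, for instance), but the underlying shape can be sketched in a few lines of ordinary Python: a fixed piece of logic that every participant runs identically and that triggers only when its condition is observed on the chain. The names below are purely illustrative.

    # Illustrative only: the condition-and-action shape of a smart contract.
    SELLER_ADDRESS = "0xSELLER"   # hypothetical address
    PRICE = 5                     # price in some crypto-currency

    def escrow_contract(observed_payments, release_goods):
        """Published once, unmodifiable, replayed identically by every participant."""
        paid = sum(amount for address, amount in observed_payments
                   if address == SELLER_ADDRESS)
        if paid >= PRICE:         # the condition written into the contract
            release_goods()       # the action the buyer is guaranteed to receive

    # Every node sees the same public transactions, so everyone reaches the same outcome.
    escrow_contract(
        observed_payments=[("0xSELLER", 5)],
        release_goods=lambda: print("condition met: goods released"),
    )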

With strong security guarantees and inherent transparency in the system, perhaps even state crypto-currencies will one day appear to fight money laundering and tax fraud.

Matthieu Girod
Developer – Full Stack & Mobile

 

Augmented Reality, Virtual Reality

There is often confusion between virtual reality (VR) and augmented reality (AR), and about their respective scopes. Previously seen as “gadget” technologies limited to the world of entertainment, their use in industrial, engineering, and research environments is gradually becoming commonplace, notably thanks to the many possibilities they offer in terms of spatial visualization.

As such, these terms could be differentiated as follows:

VR = Virtual reality
Accessible mainly through a VR headset. It immerses the user in entirely virtual experiences and environments with a variety of possible interactions.

AR = Augmented reality
Accessible through a terminal, a screen, or smart glasses. It lets the user visualize virtual experiences and data overlaid on the real world through that terminal. This technology is very often confused with hologram renderings and Pepper’s ghost techniques, especially in the media.

MR = Mixed reality
Accessible mainly through a VR headset, but one that this time lets the user perceive the real environment, creating an experience that moves between AR and VR.

With the evolution of connected objects and the development of new types of terminals, these technologies have a bright future ahead and will tend to become more widespread, particularly in public spaces, advertising and information media, and visual merchandising (in-store information displays, high-tech showcases…).

If you wish to know more about these virtual experiences, Aniwaa published an excellent guide on this topic that you can read here.

Michael Vuilleumier
Designer – User Experience & Interface