Posted in Emergent Tech

Startup edge/: Dubai’s QX Labs debuts Ask QX, redefining possibilities in AI

QX Labs’ product Ask QX offers generative AI (GenAI) capabilities that can be integrated into a range of services to enhance productivity.

Tilakraj Parmar’s entry into the world of technology stemmed from an intrinsic fascination with its workings. “It all started when we stumbled upon language models. GPT is one example of a language model, and there are thousands of them,” Parmar said in a recent conversation with edge/.

That was eight years ago. From there the groundwork began for what is today QX Labs, which released its first product, Ask QX, a week ago. “We believed then that there may be no search engines,” said Parmar.

What does the product offer? 

Today, QX Labs offers generative AI (GenAI) capabilities that can be integrated into several services and used to enhance productivity. It has also recently partnered with the Dubai Multi Commodities Centre (DMCC) for the launch of the DMCC AI Centre later this year.

At the core of its offering is Ask QX, built on a neural architecture designed for scalability and cost efficiency. “There will be multiple launches and upgrades coming in this year,” Parmar said, pointing to the company’s plans for continuous iteration. The product builds on GPT, which Parmar describes as an open-source language model, combined with neural networks and proprietary nodes intended to optimise performance while containing costs.

“The biggest advantage this gives us is that it helps us reduce the cost of GPUs, which is where most of the cost and fight is happening in the world,” Parmar remarked, acknowledging the strategic edge amid industry challenges.

And the concern is a real one: Sam Altman, OpenAI’s chief, is reportedly looking to raise trillions of dollars to overhaul the semiconductor industry. The AI chip shortage can be a roadblock for many companies.

QX Labs aims to become the first AGI company in the world to access and leverage over 372 billion parameters. 

Reducing the costs  

“From the very initial days we knew that parameter training was going to be the key. When we touched 372 billion parameters, which is roughly 6 trillion tokens, we realised we needed to come up with a new architecture so that it could be cost effective. That is when we focused on neural technology and algorithms,” said Parmar.
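Taken at face value, the figures Parmar cites imply a rough training ratio. A quick back-of-envelope check, using only the numbers quoted above (the exact training setup is not disclosed):

```python
# Rough scale check on the quoted figures: 372 billion parameters
# trained on roughly 6 trillion tokens. Tokens-per-parameter is a
# common back-of-envelope gauge of how heavily a model is trained.
parameters = 372e9   # 372 billion parameters
tokens = 6e12        # ~6 trillion tokens

ratio = tokens / parameters
print(f"~{ratio:.0f} tokens per parameter")  # ~16 tokens per parameter
```

At around 16 tokens per parameter, the quoted numbers are in the same ballpark as other large language models of this scale, though the article gives no further training details.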

He added that this reduced costs by over 70 per cent. The team, he explained, invented zero-latency algorithms in node-based architectures, technology that is currently patent pending.

This in turn makes the product more cost efficient for the user. Parmar says GPT currently behaves better than any other language model and is built squarely for the AI world. “Our difference comes in at the backend and the workings of the algorithms,” he explained.

Parmar describes GPT as an open-source language model; while QX Labs does use GPT for roughly 30 per cent of the system, the remaining 70 per cent is based on the neural networks and nodes the team has built in-house.
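The article does not say how the 30/70 split is implemented. Purely as an illustrative sketch, with all names hypothetical, one simple reading is a weighted ensemble that blends candidate-token scores from a GPT-style model with those from a proprietary model:

```python
# Hypothetical sketch only: the article states a 30% GPT / 70%
# proprietary split but not the mechanism. One simple interpretation
# is a fixed-weight ensemble over next-token scores.

def blend_scores(gpt_scores: dict, proprietary_scores: dict,
                 gpt_weight: float = 0.3) -> dict:
    """Combine per-token scores from two models with fixed weights."""
    tokens = set(gpt_scores) | set(proprietary_scores)
    return {
        t: gpt_weight * gpt_scores.get(t, 0.0)
           + (1 - gpt_weight) * proprietary_scores.get(t, 0.0)
        for t in tokens
    }

# Toy usage: each model assigns scores to candidate next tokens.
gpt = {"hello": 0.6, "hi": 0.4}
own = {"hello": 0.2, "hi": 0.8}
blended = blend_scores(gpt, own)
best = max(blended, key=blended.get)  # the proprietary model dominates
```

In practice a hybrid system could equally route whole queries between models rather than blend scores; the source gives no detail either way.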

However, building large language models isn’t easy. Parmar adds that while anyone can use a transformer to train on the language and data sets they have, QX Labs has tweaked the process to bring ethical dilemmas and factors into play.

Reducing the margin of error  

These nodes are also blocks and can be put on any chain; Parmar explains this has been experimented with on Polygon as well. One of the biggest advantages of Ask QX, which is available in Arabic, Russian, and over 30 Indian languages and dialects, is that it does not work through a translation engine: QX Labs trains the language model directly in each language.

“Currently, it has 0.5 per cent human error, and the important aspect is that it works along the dialects of the different languages as well,” explained Parmar. The idea, he adds, is to put more power in human hands. The team wants to start by focusing on health and education and then move on to industrial use cases.

The team is also working on a text-to-code model, which it wants to explore in a bigger way. Parmar adds they still have a long way to go, but the possibilities of AI across industries are vast and growing.