Intel Liftoff: AI Startups
By giving AI companies direct access to Intel’s resources, Intel Liftoff creates a collaborative and creative environment that supports founder success. The initiative, which focuses on practical applications across industries including finance, cybersecurity, media, and e-commerce, pairs Intel’s technical expertise with the ideas of early-stage startups.
The founder stories below show how Intel’s support has accelerated these companies’ progress and, with it, the broader advancement of AI.
The Media Content Analysis Breakthrough of Mod Tech Labs
Mod Tech Labs CEO Alex Porter emphasizes how Intel Liftoff has shaped the company’s product development. “This opportunity has been a game-changer for us,” Porter states. “With Intel Liftoff’s assistance, we were able to increase speed by 32x and double our precision for anomaly detection in media content.”
According to Porter, “It’s not just about the development support; it’s the strategic guidance and industry connections that have truly set us on a path to success.”
Prediction Guard: Advancing Reliable Enterprise Apps
Prediction Guard’s Daniel Whitenack emphasizes Intel’s leadership in performance and security. “Intel leads the way in terms of security and performance,” he says. The partnership lets the company focus on long-term growth and scalability, which is essential for enterprise applications built on large language models (LLMs).
Moonshot’s Efficient LLM Fine-Tuning with Intel’s Assistance
At one of Intel Liftoff’s hackathons, Daniel Han-Chen of Moonshot – Ask the Impossible learned how to fine-tune LLMs quickly. “The hack was super useful – learned how to tune LLMs efficiently with LoRA,” he says.
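For readers unfamiliar with the technique Han-Chen mentions: LoRA (Low-Rank Adaptation) freezes a model’s pretrained weights and learns only a small low-rank update to each adapted matrix. A minimal NumPy sketch of the core idea (the dimensions and rank here are illustrative, not Moonshot’s actual configuration):

```python
import numpy as np

# LoRA replaces a full weight update dW of shape (d_out, d_in) with a
# low-rank factorization B @ A, where B is (d_out, r) and A is (r, d_in),
# with rank r much smaller than d_out and d_in.
d_out, d_in, r = 768, 768, 8  # illustrative sizes, not a real model config

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                    # B starts at zero, so initially W' == W

W_adapted = W + B @ A  # effective weight used during fine-tuning

# Parameter savings: train r*(d_in + d_out) values instead of d_in*d_out.
full_params = d_in * d_out          # 589824
lora_params = r * (d_in + d_out)    # 12288 -- ~48x fewer trainable parameters
print(full_params, lora_params)
```

Because only A and B are trained, fine-tuning touches a small fraction of the parameters, which is what makes the fast hackathon-scale tuning described here feasible.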
SiteMana’s Innovation in Automated E-Commerce Marketing
SiteMana founder Peter Ma, who took part in Intel Liftoff’s LLM hackathon earlier this year, recalls: “With the stellar performance of the Intel GPU, Dolly 2.0, and OpenLlama at our disposal during the hackathon, we were able to build an LLM model inspired by state-of-the-art chatbots. Deployment and testing were remarkably smooth, and we fine-tuned the model to generate personalized emails.”
Selecton Technologies’ Game-Changing AI Assistant for Gamers
Selecton co-founder Yevgen Lopatin described the team’s experience at the LLM Hackathon sprint: “We were able to validate the solution with the valuable GPU resources that the sprint offered.” Using an Intel Data Center GPU Max 1100 with 48 GB of VRAM and a LoRA training script, the team fine-tuned the Dolly LLM in a single day of training, achieving strong results with compute capacity that had previously been out of reach.
The Deep Learning Advancements of Terrain Analytics
Terrain can now effectively scale its deep learning and large language models (DL/LLMs) without running into computational limits, thanks to Intel technology. “Both of the models we built had better success metrics than the ones we built with OpenAI’s Ada model, and were 15x faster to run,” states Nathan Berkley, data scientist at Terrain Analytics. Once these models are refined, Terrain will be able to accurately classify job titles that are currently unclassified, map them to its other datasets, and deliver the functionality its customers have been requesting.
Beewant: Thriving Inside Intel’s Ecosystem
Beewant CEO Ahmed Joudad highlighted the importance of being part of Intel’s ecosystem, saying it helped speed the development of their product. “The chance to network with other entrepreneurs was one of the finest features of the Intel Liftoff program. We benefited not only from the hardware and the team’s expertise, but from being part of the whole ecosystem. That ecosystem has played a significant role in how quickly we have been able to build the Beewant solution.”
Argilla’s Strategic Growth Under Professional Guidance
Argilla’s David Berenstein agrees that the program has helped the team improve its training procedures and align its development plans with industry standards. “For Argilla, enrolling in the program was a crucial choice. We have gained a great deal over the last six months from the knowledgeable advice and industry insights of Intel’s seasoned personnel. Regular office hours and meetings, together with the program’s hands-on approach, have proven crucial in improving our training procedures and aligning our development plans with industry best practices.”
Honoring Achievement at Intel Innovation 2023
Prediction Guard founder Daniel Whitenack shared his impressions of the event’s significance: “What a fantastic occasion! First, via the Intel Liftoff program, we had the opportunity to network with other cutting-edge AI businesses and the Intel engineers who helped us launch the product on Intel Habana Gaudi2. Building those contacts for potential future cooperation was fantastic. We also got to see Prediction Guard running on Gaudi2 in the Intel Developer Cloud demo. Thanks to the Intel Liftoff startup program, we were able to demonstrate a new cloud service as the first product during the CTO’s keynote. Something I will always remember!”