Gcore is pleased to announce the availability of Pienso on the Gcore Cloud Platform powered by Graphcore IPUs. Pienso is a deep learning natural language processing (NLP) software company that makes it easy for domain experts to use Large Language Models to build, train and manage AI models aligned with their organization’s own data. Because Pienso is no-code/low-code, subject matter experts can shape models without ever seeing a line of code.
Pienso will utilize built-for-AI Graphcore IPU technology on the Gcore Cloud Platform to offer fast, cost-effective AI solutions to its end customers. Graphcore IPU technology is renowned for its ability to accelerate AI workloads, shrinking the time and cost to insights. By partnering with Gcore, Pienso will leverage Graphcore's high-performance AI accelerators while abstracting away the complexities of MLOps, delivering a turnkey solution. This gives end customers a cost-effective choice of how and where they build and deploy interactive AI models.
MIT-Born Interactive AI Technology Powered by Graphcore IPUs
“We are delighted to partner with Pienso to provide Graphcore IPU technology-as-a-service on the Gcore Cloud platform,” said Andre Reitenbach, CEO of Gcore. “Together, we are equipping public and private sector organizations to deliver value from their AI and make a real impact on their operations.”
Built to harness the power of Large Language Models without the risk of vendor lock-in or exorbitant egress charges, Pienso is positioned by this collaboration to meet the growing demand for more accessible AI solutions across a wide range of industries, particularly in sectors and regions that prize data sovereignty.
“We are pleased to partner with Gcore to leverage their expertise in cloud computing, which brings the power of Graphcore IPUs to enterprise AI customers as a service. For companies and government entities with data privacy concerns, the Pienso solution is an ideal alternative to Large Language Models accessed via a CSP-hosted API,” said Birago Jones, CEO of Pienso. “Our code-free interface, coupled with Graphcore IPUs, which are designed for AI workloads, delivers a user experience that encourages experimentation. Fast-throughput, low-latency, interactive AI delivers insights in time to act on them. The self-service portal and on-demand pricing Gcore offers make it simple for users to get started quickly.”
Cloud-Native Privacy
Pienso’s service runs on Graphcore IPUs on the Gcore Cloud Platform, using data centers located in mainland Europe, a requirement for customers who need to ensure data privacy and sovereignty.
Gcore’s reputation for delivering ultra-low-latency services made it the ideal choice to support Pienso’s offering. The combination of the performance boost enabled by IPUs and the low cloud latency offered by Gcore allows Pienso to serve the growing number of customers requiring near real-time insights, such as customer contact centers looking to monitor emerging problems and potential opportunities across large volumes of inbound communication.
To try Pienso on the Gcore Cloud Platform running on Graphcore IPUs, please contact sales@gcore.com.