Ant Group today unveiled SecretFlow Cloud, its cloud-based cryptographic computing platform, along with a suite of cryptographic computing solutions tailored for large language models (LLMs), at the World Artificial Intelligence Conference in Shanghai.
Leveraging integrated software-hardware cryptographic computing technologies, these solutions keep data encrypted as it circulates through the hosting and inference processes of LLMs, protecting LLM assets and data security while significantly strengthening privacy protection. Through the platform's encrypted hosting service, providers can encrypt and deploy their LLMs to cloud environments with a single click, safeguarding their assets from compromise and theft. Meanwhile, the encrypted inference service protects data and trade secrets during interactions with LLMs without sacrificing efficiency.
These solutions allow GPUs to perform computing tasks inside trusted execution environments, narrowing the cost and performance gap between encrypted LLM inference and plaintext operation. The platform also integrates complementary technologies such as memory and disk encryption, providing end-to-end encryption and secure cross-domain model hosting management. In addition, a remote attestation system lets users verify the execution environment directly through web interfaces, minimizing friction.
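To illustrate the attestation step described above, the sketch below shows a simplified client-side check of a TEE attestation report before an encrypted prompt is submitted. The report fields, measurement scheme, and function names are assumptions made for illustration; they are not the SecretFlow Cloud API.

```python
import hashlib
import os

# Illustrative only: a simplified client-side check of a TEE attestation report
# before an encrypted prompt is sent for inference. Field names, the measurement
# scheme, and the verification flow are assumptions, not the SecretFlow Cloud API.

# Hash of the approved model-serving enclave image, published out of band.
EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-image").hexdigest()

def verify_attestation(report: dict, expected_nonce: str) -> bool:
    """Accept the enclave only if its code measurement matches the approved value
    and the report is bound to the fresh nonce we supplied (no replay)."""
    return (
        report.get("measurement") == EXPECTED_MEASUREMENT
        and report.get("nonce") == expected_nonce
    )

# The client supplies a fresh nonce so a stale report cannot be replayed.
nonce = os.urandom(16).hex()

# In practice the report would be produced and signed inside the enclave;
# here it is mocked so the example runs on its own.
report = {"measurement": EXPECTED_MEASUREMENT, "nonce": nonce}

if verify_attestation(report, nonce):
    print("Enclave verified; safe to send the encrypted prompt.")
else:
    print("Attestation failed; do not send data.")
```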
SecretFlow Cloud currently supports both public and private cloud deployments and is compatible with commonly used LLMs in the global market. For instance, in a public cloud setup, users can build a new professional LLM on SecretFlow Cloud or migrate an existing one to the platform, and have secure LLM inference services running in as little as 10 minutes with a single click. Users can also purchase cryptographic computing resources on demand to meet their needs.
"Data is the most critical element in the application of LLMs, and cryptographic computing technology significantly enhances the utility of data in cross-domain scenarios," noted Dr. Lenx Wei, Vice President and Chief Technology Security Officer at Ant Group. "As LLMs evolve toward specialized professional domains and become essential productivity tools, cryptographic computing will play a vital role, particularly in fully unlocking the value of high-quality, domain-specific data sets."
According to Dr. Wei, while companies often deploy LLMs in private environments to address data security challenges, such methods can incur higher operational costs and compromise both service efficiency and quality, ultimately constraining the potential of LLMs. The cryptographic computing solutions offered by SecretFlow Cloud can effectively resolve these issues.
Looking forward, SecretFlow Cloud aims to further develop its cryptographic computing solutions to ensure data security throughout the full lifecycle of professional LLMs, including their creation, deployment, and service delivery.
Ant Group has been exploring privacy-preserving computing technologies since 2016 and released SecretFlow in 2022 as an open-source privacy-preserving computing framework featuring an array of advanced technologies, including multi-party computation, federated learning, trusted execution environments, homomorphic encryption, and differential privacy. Positioned at the forefront of cryptographic computing, SecretFlow Cloud represents a new generation of privacy-preserving computing technology, designed to overcome pain points in security, cost, and accessibility in data circulation and to make data value available on demand, much like turning on a tap. To date, its technologies have been adopted in industries such as insurance, rural finance, healthcare, public services, and marketing.
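For a concrete taste of the open-source framework mentioned above, the sketch below runs a two-party secure computation in SecretFlow's local simulation mode, following the pattern in the project's public tutorials. Exact API details can differ between SecretFlow versions, so treat this as an illustrative sketch rather than a definitive usage guide.

```python
import numpy as np
import secretflow as sf

# Minimal sketch, following the open-source SecretFlow tutorials (API details
# may differ between versions). Two logical parties are simulated in a single
# process; each keeps its data on its own device.
sf.init(['alice', 'bob'], address='local')

alice, bob = sf.PYU('alice'), sf.PYU('bob')

# SPU is a virtual device backed by secure multi-party computation.
spu = sf.SPU(sf.utils.testing.cluster_def(['alice', 'bob']))

# Private inputs: generated on each party's own device, never shared in plaintext.
x = alice(np.random.rand)(3)
y = bob(np.random.rand)(3)

# The dot product is evaluated jointly over secret shares on the SPU device.
z = spu(np.dot)(x.to(spu), y.to(spu))

# reveal() decrypts the result; used here only to display the demo output.
print(sf.reveal(z))
```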