Powered by a knowledge graph with deep contextual information about data and controls from diverse data systems and apps, Gencore AI accelerates enterprise-grade generative AI development
Securiti, the pioneer in data security, privacy, governance and compliance, today announced the release of Gencore AI, a first of its kind holistic solution to easily and quickly build safe, enterprise-grade generative artificial intelligence (GenAI) systems, copilots and AI agents. This new solution accelerates GenAI adoption in the enterprise by making it easy to build unstructured and structured data + AI pipelines utilizing proprietary enterprise data across hundreds of diverse data systems and applications.
Also Read: AiThority Interview with Jie Yang, Co-founder and CTO of Cybever
GenAI is set to drive significant productivity gains, leading to massive economic growth. For enterprise organizations, the biggest barrier to deploying GenAI systems at scale is safely connecting to data systems while ensuring proper controls and governance throughout the AI pipeline. Since the majority of an organization’s data is unstructured data, it’s critical to properly govern and control these assets as they are tapped to fuel AI.
Gencore AI addresses these challenges by enabling organizations to safely connect to hundreds of data systems while preserving data controls and governance as data flows into modern GenAI systems. It is powered by a unique knowledge graph that maintains granular contextual insights about data and AI systems. Gencore AI provides robust controls throughout the AI system to align with corporate policies and entitlements, safeguard against malicious attacks and protect sensitive data.
Today, Securiti also announced that it has integrated NVIDIA NIM microservices into Gencore AI. Integrated in the new solution is a breadth of NIM microservices for the latest AI foundation models, enabling organizations to easily choose the most appropriate NIM microservice for their proprietary data.
Gencore AI automatically learns data controls (like entitlements) in underlying systems and applies them at the AI usage layer, protects AI systems against malicious use, and provides full provenance of the entire AI system for comprehensive monitoring and controls. Gencore AI also provides the flexibility to choose from a rich library of large language models (LLMs) and vector databases to optimize business outcomes.
Also Read: GenAI Powered Copilots: How They Work
Organizations can use Gencore AI to quickly and easily build end-to-end safe AI systems, or to provide key building blocks of GenAI projects. Key capabilities include:
- Build Safe Enterprise AI Copilots: Leveraging the rich library of connectors and unique knowledge graph, build enterprise AI copilots, knowledge systems and apps that combine data from multiple systems, in minutes. Enterprise controls, like entitlements in data systems, are automatically learned and applied at the AI usage layer. Gain full provenance of the entire AI system, including data and AI usage — down to the level of each file, every user, and all AI models and usage endpoints.
- Safely Sync Data to Vector Databases: Quickly and securely ingest and sync unstructured and structured data at scale from any system, including SaaS, IaaS, private clouds, and data lakes and warehouses. Turn data into custom embeddings, while retaining associated metadata and load them to your chosen vector database, making enterprise data ready for LLM usage.
- Curate and Sanitize Unstructured Data for Model Training: Easily assemble, curate, cleanse and sanitize high quality data sets for AI model training and tuning.
- Protect AI Interactions: The natural language conversation-aware LLM Firewall, protects user prompts, responses and data retrievals in AI systems. The LLM Firewall helps moderate AI interaction to align with enterprise policies, protect against sensitive data leaks, and prevent attacks like prompt injections and jailbreaking instructions.
[To share your insights with us as part of editorial or sponsored content, please write to psen@itechseries.com]