• Generative AI

    Generative AI is here:
    Are you ready?

    Intel® Innovation Built-in
      • Where to start

        A new Generative AI age

        Generative AI (GenAI) is being leveraged on a massive scale by organizations and individuals, causing a significant societal impact. Consumer-grade AI, like ChatGPT and DALL-E, has captured everyone's imagination with its ability to generate content. Yet, GenAI's impact on organizations promises even more value, including heightened productivity, cost reduction, and a transformation in how we work.

      • 76%

        of IT leaders believe GenAI will be significant if not transformative for their organization.

        GenAI will transform organizations

        GenAI brings rewards, but it also comes with new challenges and risks. As organizations embark on their GenAI journey, they cannot risk customer trust and the high value of their data for the reward of being first to the finish line.

      • The right data for your use case

        To effectively leverage the power of GenAI, organizations must strategically utilize rich, high-quality data. The best use cases rely on this data and require a balanced blend of skill sets, budgets, and resources, highlighting the importance of collaboration between business and IT teams to establish priorities.

        • Content Creation

        • Natural Language Search

        • Code Generation

        • Digital Assistant

        • Design & Data Creation

        • Document Automation

      • Achieve Generative AI success

        While enterprises have unearthed hundreds of use cases across every vertical, landing on the right use cases is crucial.


      • People and teams

        Prepare your organization to address the GenAI opportunity, with IT organizations focusing inward and the business looking outward.

      • Processes and policies

        Define and communicate how your organization will leverage AI and make it a critical aspect of your business to engage employees.

      • Technology

        Deliver secure access to GenAI across your organization, avoiding Shadow AI instances to ensure data integrity and compliance.

      • Strategy

        Capture your environment's “as-is” current state to determine the strategic vision and guiding principles for future GenAI projects.

      • The vital role of your data in Generative AI

        Data and risk go hand-in-hand. Data will drive your GenAI projects forward, but you also need to assess the potential risks of hosting GenAI models in public clouds, including: intellectual property loss, data leakage, privacy issues, compliance violations, credibility and integrity loss, bias, and IP infringement.

        • Managing risk and enhancing value

          As you begin your journey, it is essential to align investments in technology and training to boost operational maturity, reduce risk, enhance control, and maximize value for your organization. With enterprise-ready GenAI, you gain control over who can access your data.

      • Graph: risk decreases as your organization's data handling becomes more operationally mature, from immature GenAI data handling at the high-risk end to enterprise-ready GenAI at the low-risk end.
      • USE AI WITH YOUR DATA

        Keep Generative AI models close to your data

        • Understanding the risks and benefits of different deployment options is crucial in determining your organization's optimal workload placement for GenAI. When it comes to bringing AI to your data, deploying private instances of GenAI large language models (LLMs), such as Llama 2 or Falcon, offers advantages in speed and deployment, but it may also involve higher costs and other drawbacks. Either way, in-house GenAI will likely provide the most value for your early efforts.

          In terms of workload placement, GenAI is no different than any other workload. To get the best outcomes, put it in the environment that makes the most sense based on your business requirements and technical needs.

          The diagram below conveys concepts and frameworks that come into play when determining GenAI workload placement.

      • Chart: six factors to weigh when choosing between private cloud and public cloud for GenAI workload placement, shown as horizontal sliders. 'Where data resides' and 'Secure access' lean strongly toward private cloud; 'Cost' leans moderately and 'Accuracy and customization' leans slightly toward private cloud; 'Faster time to value' leans slightly toward public cloud; and 'General-purpose use cases' lean strongly toward public cloud.
        • Data Management for Generative AI

          Most organizations are taking a two-pronged approach to their GenAI strategy. They're experimenting with tactical deployments to learn and avoid falling behind, while also developing a long-term strategy to accommodate the many use cases that will emerge over time. This approach requires a two-tiered data management strategy.

        • Data Preparation

        • Data discovery

          Identify data sets and define data requirements

        • Data exploration and enrichment

          Design and implement a data pipeline to tag, cleanse, label, and anonymize data


        • Short-term: Data Preparation

          Data preparation includes identifying data sets and defining data requirements, followed by cleansing, labeling, and anonymizing the data, then normalizing it across data sources. It also requires building data pipelines to integrate the data into a model.
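          As a concrete illustration, the sketch below shows what a minimal data-preparation step might look like in Python with pandas: cleansing and anonymizing two hypothetical source extracts, then normalizing them to a common schema. The file names and column names are assumptions for illustration only, not part of any specific pipeline.

```python
import hashlib

import pandas as pd

# Hypothetical source extracts; file and column names are illustrative only.
crm = pd.read_csv("crm_export.csv")            # e.g. columns: customer_email, notes
support = pd.read_csv("support_tickets.csv")   # e.g. columns: email, ticket_text

def anonymize(value: str) -> str:
    """Replace a direct identifier with a stable pseudonym (truncated hash)."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()[:12]

# Cleanse and anonymize each source, then normalize to a common schema.
crm_clean = pd.DataFrame({
    "customer_id": crm["customer_email"].fillna("").map(anonymize),
    "text": crm["notes"].fillna("").str.strip(),
    "source": "crm",
})
support_clean = pd.DataFrame({
    "customer_id": support["email"].fillna("").map(anonymize),
    "text": support["ticket_text"].fillna("").str.strip(),
    "source": "support",
})

# One normalized, de-duplicated corpus, ready for a downstream GenAI pipeline.
corpus = pd.concat([crm_clean, support_clean], ignore_index=True)
corpus = corpus[corpus["text"] != ""].drop_duplicates(subset=["text"])
```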

        • Data Engineering

        • Data ingestion

          Integrate enterprise data into Large Language Models

        • Observability and performance

          Verify that transformed data meets objectives


        • Long-term: Data Engineering

          Organizations need a well-structured data repository, such as a data lake or data lakehouse, to integrate their data with GenAI models. Consider building the data lake iteratively to progressively expand the capabilities of the GenAI data repository while the team enhances their data management and GenAI skills.
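          To make the long-term approach concrete, here is a minimal sketch of landing curated data in a data lake as partitioned Parquet, assuming pandas and pyarrow; the staging and lake paths are placeholders, not a prescribed layout. Writing partitioned files lets the repository grow iteratively as new sources and use cases are added.

```python
import pandas as pd  # requires pyarrow for Parquet support

# "corpus" is the normalized output of the data-preparation step above;
# the staging and lake paths below are placeholders, not a prescribed layout.
corpus = pd.read_parquet("staging/corpus.parquet")

# Land curated data in the lake as partitioned Parquet so new sources and
# use cases can be added iteratively without reworking existing data.
corpus.assign(ingest_date=pd.Timestamp.now(tz="UTC").date().isoformat()).to_parquet(
    "datalake/genai/curated/",
    partition_cols=["source", "ingest_date"],
    index=False,
)
```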


        • “This collaboration [with Dell Technologies] will empower companies to build their own AI systems leveraging the incredible innovation of the open source community while benefiting from the security, compliance, and performance of Dell systems.”

          Jeff Boudier, Head of Product and Growth, Hugging Face
      • RIGHT-SIZE AI

        Define infrastructure and right-size AI

      • 75%

        Dell AI solutions for inferencing LLMs on-premises can be up to 75% more cost effective than the public cloud.1

        Your unique data enables you to pursue domain- and enterprise-specific use cases, creating industry value through tasks or functions for which you have exclusive ownership of the data. Different types of GenAI have corresponding entry points and investments that are necessary to ensure success. LLMs trained on vast amounts of text are like encyclopedias: helpful for general use, but they may not be suitable for answering specific questions regarding your organizational data.

      • Your data greatly improves the efficiency and value of GenAI

      • Graphic: relative data requirements and business value of three types of GenAI models. Large language models (LLMs) serve general-purpose use cases and use the most data; they can be costly and energy-intensive and are more prone to hallucinations. Domain-specific models use less, but more specific, data; their functionality is narrower but more relevant and valuable to your business. Enterprise-specific models use the least data yet are the most specific and accurate, offering the greatest business value.
      • AI deployment models: Assessing cost and value tradeoffs

        The first three deployment models shown below are what most organizations are currently implementing, often starting with model augmentation and ultimately moving to fine-tuning. The model you choose will depend on your organization's level of data science readiness, deployment patterns, and the implications of each.

      • Pre-trained model

        Referred to as 'prompt engineering,' this approach involves posing a question to a pre-trained model and receiving a result.
        Example: ChatGPT

        Model augmentation

        Enhance your GenAI model by adding your data at inference time to provide additional context for its answers, as in use cases like retrieval-augmented generation (RAG).

        Fine-tuning models

        This involves adjusting model weighting and incorporating your data. While it leads to improved results, it also demands more effort during setup (see the sketch after the comparison table below).

        Model training

        This involves building a purpose-specific model and training it on your own data set. It typically requires the most work and resources and is often reserved for solving complex problems.

        How the four deployment models compare:
        Pre-trained model: small effort, low cost, minimal value and differentiation, no data integration; infrastructure: client-server; skills: IT ops.
        Model augmentation: medium effort, medium cost, medium value and differentiation, high data integration; infrastructure: client-server; skills: developer.
        Fine-tuning models: high effort, high cost, high value and differentiation, high data integration; infrastructure: GPU-optimized; skills: data scientist(s).
        Model training: significant effort, significant cost, significant value and differentiation, significant data integration; infrastructure: large GPU deployment; skills: data scientist(s).
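        For the fine-tuning row above, one common approach is parameter-efficient fine-tuning with LoRA. The sketch below assumes the Hugging Face transformers and peft libraries and the gated Llama 2 7B checkpoint; the adapter settings are illustrative defaults, not a validated recipe.

```python
# Requires: pip install transformers peft torch
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "meta-llama/Llama-2-7b-hf"   # gated checkpoint; access must be granted
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA trains small adapter matrices instead of all model weights, which keeps
# the memory footprint of fine-tuning manageable.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # adapter rank (illustrative default)
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections in Llama-style models
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()         # typically well under 1% of all parameters

# From here, training proceeds on your prepared, anonymized data set,
# for example with transformers' Trainer.
```

        Because only the small adapters are trained, this style of fine-tuning can often run on a GPU-optimized server rather than requiring a large GPU deployment.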
      • Choose the right infrastructure for your model

        The infrastructure supporting your GenAI deployment depends largely on computational requirements, influenced by model type, model size, and number of users. Additional considerations include the necessary storage capacity for data used during deployment, training, and model refinement.
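        As a rough illustration of how model size drives infrastructure choices, the back-of-the-envelope estimate below assumes weights-only memory of about two bytes per parameter at FP16 plus a modest overhead for activations and the KV cache; treat the numbers as order-of-magnitude guidance, not a sizing guarantee.

```python
# Back-of-the-envelope sizing (an assumption-based heuristic, not a sizing guideline):
# inference memory (GB) ≈ parameters (billions) × bytes per parameter × overhead factor
def estimate_inference_memory_gb(params_billions: float,
                                 bytes_per_param: float = 2.0,     # FP16/BF16 weights
                                 overhead: float = 1.2) -> float:  # KV cache, activations
    return params_billions * bytes_per_param * overhead

for size in (7, 70):
    print(f"{size}B parameters @ FP16 ≈ {estimate_inference_memory_gb(size):.0f} GB")
# 7B  ≈ 17 GB  -> can fit on a single high-memory GPU or workstation
# 70B ≈ 168 GB -> spans multiple GPUs, i.e. an AI-optimized server
```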

      • Chart: three GenAI requirements (complexity of model, number of parameters, and number of users) mapped to the appropriate Dell hardware solutions. The solutions range from CPU-oriented general compute (Dell laptops, through Precision workstations and PowerEdge servers) up to GPU-intensive, AI-optimized PowerEdge XE servers, with Dell storage and networking hardware applicable across the entire range. Complexity of model ranges from using pre-trained models, to augmenting or fine-tuning models, to training new models; the number of parameters ranges from millions to billions, all the way up to trillions.
      • ACCELERATE YOUR AI JOURNEY

        Start with an early win

        Retrieval-augmented generation (RAG) is an ideal early use case for many organizations: it taps additional resources, such as your own data, to augment a model without retraining it. Explore RAG setups that can be applied to enhance your business and data.

      • RAG Use Case

        Apply RAG to a custom PDF dataset
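        A minimal sketch of what such a RAG pipeline over a PDF might look like, assuming the pypdf and scikit-learn libraries; the file name and question are placeholders, and TF-IDF retrieval stands in for the embedding model and vector database a production deployment would typically use.

```python
# Requires: pip install pypdf scikit-learn
from pypdf import PdfReader
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# 1. Extract and chunk the PDF (file name and question are placeholders).
reader = PdfReader("product_manual.pdf")
text = "\n".join(page.extract_text() or "" for page in reader.pages)
chunks = [text[i:i + 1000] for i in range(0, len(text), 1000)]

# 2. Index the chunks. TF-IDF keeps the sketch dependency-light; a production
#    deployment would typically use an embedding model and a vector database.
vectorizer = TfidfVectorizer()
chunk_vectors = vectorizer.fit_transform(chunks)

# 3. Retrieve the chunks most relevant to the user's question.
question = "What is the warranty period?"
scores = cosine_similarity(vectorizer.transform([question]), chunk_vectors)[0]
top_chunks = [chunks[i] for i in scores.argsort()[::-1][:3]]

# 4. Augment the prompt with the retrieved context before sending it to a
#    privately hosted LLM (the generation call itself is omitted here).
prompt = (
    "Answer using only the context below.\n\n"
    + "\n---\n".join(top_chunks)
    + f"\n\nQuestion: {question}"
)
print(prompt[:500])
```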

      • DELL VALIDATED DESIGN FOR RAG

        Deploy a digital assistant on Dell APEX Cloud Platform for Red Hat OpenShift

    • Let us help accelerate your journey

      Dell can help you remove barriers and enable enterprise-wide GenAI adoption through an end-to-end holistic approach from the deskside to the data center.

    • PowerEdge Servers for AI

    • Storage for AI

    • Data Management for AI

    • Precision Workstations

    • AI-Capable Laptops

    • Professional Services for AI

    • 1 Based on Enterprise Strategy Group research commissioned by Dell, "Maximizing AI ROI: Inferencing On-premises With Dell Technologies Can Be 75% More Cost-effective Than Public Cloud," comparing on-premises Dell infrastructure with native public cloud infrastructure as a service and token-based APIs, April 2024. Expected costs were modeled utilizing RAG for small (5K users), medium (10K users), and large (50K users) deployments and two LLMs (7B and 70B parameters) over 3 years. Actual results may vary.