
Commit 179c8a3: branding fixes

1 parent: 762ebaf


42 files changed, +106 -106 lines

README.MD

Lines changed: 7 additions & 7 deletions
@@ -13,20 +13,20 @@ If you will be delivering this session, check the [session-delivery-sources](./s
 
 ### Session Description
 
-Build a conversational AI agent for Zava, a retail DIY company, that analyzes sales data and helps customers find products. Learn to create secure, intelligent agents using Azure AI Foundry Agent Service, Model Context Protocol (MCP) for external data connections, and PostgreSQL with Row Level Security (RLS) and pgvector for role-based data protection and semantic search.
+Build a conversational AI agent for Zava, a retail DIY company, that analyzes sales data and helps customers find products. Learn to create secure, intelligent agents using Microsoft Foundry Agent Service, Model Context Protocol (MCP) for external data connections, and PostgreSQL with Row Level Security (RLS) and pgvector for role-based data protection and semantic search.
 
 ### 🧠 Learning Outcomes
 
 By the end of this session, learners will be able to:
 
-1. **Azure AI Foundry Agent Service**: Build and deploy AI agents with integrated tools and observability.
+1. **Microsoft Foundry Agent Service**: Build and deploy AI agents with integrated tools and observability.
 2. **Model Context Protocol (MCP)**: Connects the Agent Service to external tools and data over industry standard protocols to enhance agent functionality.
 3. **PostgreSQL**: Use PostgreSQL as a vector database for semantic search and implement Row Level Security (RLS) to protect sensitive data based on user roles.
-4. **Azure AI Foundry**: An enterprise-grade AI development platform providing unified model access, comprehensive monitoring, distributed tracing capabilities, and production-ready governance for AI applications at scale.
+4. **Microsoft Foundry**: An enterprise-grade AI development platform providing unified model access, comprehensive monitoring, distributed tracing capabilities, and production-ready governance for AI applications at scale.
 
 ### 💻 Technologies Used
 
-1. Azure AI Foundry
+1. Microsoft Foundry
 1. PostgreSQL including Row Level Security (RLS) and Semantic Search with the pgvector extension
 1. Model Context Protocol (MCP)
 
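Learning outcome 3 above combines two PostgreSQL features: pgvector similarity search and Row Level Security. As a rough illustration (not part of this commit, and the table, column, and role names are assumptions rather than the workshop's actual schema), a product-search tool might run a query like this:

```python
# Illustrative sketch only: retail.products, its embedding column, and the
# store_manager role are assumed names, not the workshop's actual schema.
import psycopg  # psycopg 3

SEARCH_SQL = """
    SELECT product_name, description
    FROM retail.products
    ORDER BY embedding <=> %(query_vec)s::vector  -- pgvector cosine distance
    LIMIT 5;
"""

def find_similar_products(conn_str: str, query_embedding: list[float]) -> list[tuple]:
    with psycopg.connect(conn_str) as conn, conn.cursor() as cur:
        # RLS policies attached to retail.products decide which rows this
        # role may see before the similarity ranking is applied.
        cur.execute("SET ROLE store_manager;")
        cur.execute(SEARCH_SQL, {"query_vec": str(query_embedding)})
        return cur.fetchall()
```

The point of RLS in this sketch is that the same query returns different rows for different roles, so the agent never has to filter sensitive data itself.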

@@ -36,7 +36,7 @@ By the end of this session, learners will be able to:
 |:-------------------|:----------------------------------|:-------------------|
 | Workshop Repository | [Unlock your Agents Potential with MCP and PostgreSQL](https://github.com/microsoft/Unlock-Your-Agents-Potential-with-MCP-and-PostgreSQL) | Workshop materials and resources |
 | Workshop Docs | [Workshop Documentation](https://microsoft.github.io/aitour26-WRK540-unlock-your-agents-potential-with-model-context-protocol/) | Workshop documentation site |
-| Documentation | [Azure AI Foundry](https://learn.microsoft.com/azure/ai-foundry/)| Azure AI Foundry documentation |
+| Documentation | [Microsoft Foundry](https://learn.microsoft.com/azure/ai-foundry/)| Microsoft Foundry documentation |
 | Module | [Fundamentals of AI agents on Azure](https://learn.microsoft.com/training/modules/ai-agent-fundamentals/)| Training module on AI agent fundamentals |
 | Documentation | [Tracing using Application Insights](https://learn.microsoft.com/azure/ai-services/agents/concepts/tracing)| Guide to tracing with Application Insights |
 | Documentation | [Evaluating your AI agents with Azure AI Evaluation SDK](https://learn.microsoft.com/azure/ai-foundry/how-to/develop/agent-evaluate-sdk)| AI agent evaluation documentation |
@@ -96,11 +96,11 @@ Microsoft’s approach to responsible AI is grounded in our AI principles of f
 
 Large-scale natural language, image, and speech models - like the ones used in this sample - can potentially behave in ways that are unfair, unreliable, or offensive, in turn causing harms. Please consult the [Azure OpenAI service Transparency note](https://learn.microsoft.com/legal/cognitive-services/openai/transparency-note?tabs=text) to be informed about risks and limitations.
 
-The recommended approach to mitigating these risks is to include a safety system in your architecture that can detect and prevent harmful behavior. [Azure AI Content Safety](https://learn.microsoft.com/azure/ai-services/content-safety/overview) provides an independent layer of protection, able to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety includes text and image APIs that allow you to detect material that is harmful. Within Azure AI Foundry portal, the Content Safety service allows you to view, explore and try out sample code for detecting harmful content across different modalities. The following [quickstart documentation](https://learn.microsoft.com/azure/ai-services/content-safety/quickstart-text?tabs=visual-studio%2Clinux&pivots=programming-language-rest) guides you through making requests to the service.
+The recommended approach to mitigating these risks is to include a safety system in your architecture that can detect and prevent harmful behavior. [Azure AI Content Safety](https://learn.microsoft.com/azure/ai-services/content-safety/overview) provides an independent layer of protection, able to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety includes text and image APIs that allow you to detect material that is harmful. Within Microsoft Foundry portal, the Content Safety service allows you to view, explore and try out sample code for detecting harmful content across different modalities. The following [quickstart documentation](https://learn.microsoft.com/azure/ai-services/content-safety/quickstart-text?tabs=visual-studio%2Clinux&pivots=programming-language-rest) guides you through making requests to the service.
 
 Another aspect to take into account is the overall application performance. With multi-modal and multi-models applications, we consider performance to mean that the system performs as you and your users expect, including not generating harmful outputs. It's important to assess the performance of your overall application using [Performance and Quality and Risk and Safety evaluators](https://learn.microsoft.com/azure/ai-studio/concepts/evaluation-metrics-built-in). You also have the ability to create and evaluate with [custom evaluators](https://learn.microsoft.com/azure/ai-studio/how-to/develop/evaluate-sdk#custom-evaluators).
 
-You can evaluate your AI application in your development environment using the [Azure AI Evaluation SDK](https://microsoft.github.io/promptflow/index.html). Given either a test dataset or a target, your generative AI application generations are quantitatively measured with built-in evaluators or custom evaluators of your choice. To get started with the azure ai evaluation sdk to evaluate your system, you can follow the [quickstart guide](https://learn.microsoft.com/azure/ai-studio/how-to/develop/flow-evaluate-sdk). Once you execute an evaluation run, you can [visualize the results in Azure AI Foundry portal](https://learn.microsoft.com/azure/ai-studio/how-to/evaluate-flow-results).
+You can evaluate your AI application in your development environment using the [Azure AI Evaluation SDK](https://microsoft.github.io/promptflow/index.html). Given either a test dataset or a target, your generative AI application generations are quantitatively measured with built-in evaluators or custom evaluators of your choice. To get started with the azure ai evaluation sdk to evaluate your system, you can follow the [quickstart guide](https://learn.microsoft.com/azure/ai-studio/how-to/develop/flow-evaluate-sdk). Once you execute an evaluation run, you can [visualize the results in Microsoft Foundry portal](https://learn.microsoft.com/azure/ai-studio/how-to/evaluate-flow-results).
 
 ## Compiling the Python Requirements
 
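The Content Safety paragraph in the hunk above points at the REST quickstart; for readers on the Python path, a minimal sketch with the `azure-ai-contentsafety` client library (the endpoint and key environment variable names are placeholders, not values from this repo) looks roughly like this:

```python
# Minimal sketch: screen a piece of text with Azure AI Content Safety.
# Assumes the azure-ai-contentsafety package and a provisioned resource;
# the environment variable names are placeholders.
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

result = client.analyze_text(AnalyzeTextOptions(text="User prompt to screen"))

# Each entry reports a harm category (Hate, SelfHarm, Sexual, Violence) with a
# severity score you can threshold on before passing the text to the agent.
for item in result.categories_analysis:
    print(item.category, item.severity)
```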

docs/docs/en/agent-service-overview.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 The Foundry Agent Service offers a fully managed cloud service with SDKs for [Python](https://learn.microsoft.com/azure/ai-services/agents/quickstart?pivots=programming-language-python-azure){:target="_blank"}, [C#](https://learn.microsoft.com/azure/ai-services/agents/quickstart?pivots=programming-language-csharp){:target="_blank"}, and [TypeScript](https://learn.microsoft.com/azure/ai-foundry/agents/quickstart?pivots=programming-language-typescript){:target="_blank"}. The Foundry SDKs simplify AI agent development, reducing complex tasks like tool calling MCP Server tools with just a few lines of code.
 
-The Azure AI Foundry Agent Service offers several advantages over traditional agent platforms:
+The Microsoft Foundry Agent Service offers several advantages over traditional agent platforms:
 
 - **Rapid Deployment**: Optimized SDK for fast deployment, letting developers focus on building agents.
 - **Scalability**: Designed to handle varying user loads without performance issues.
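To give a feel for the "few lines of code" claim in Python, here is a hedged sketch in the spirit of the Foundry Agents quickstart. It assumes an SDK version that exposes an `McpTool` helper; the class name, parameters, endpoint, model deployment, and MCP URL are all assumptions and may differ from the workshop's actual code.

```python
# Hedged sketch, not the workshop's source: assumes azure-ai-projects /
# azure-ai-agents with an McpTool helper; names and parameters may differ
# between SDK versions. The endpoint, model, and MCP URL are placeholders.
import os

from azure.ai.agents.models import McpTool
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# Point the agent at a DevTunnel-exposed MCP server (placeholder URL).
sales_mcp = McpTool(
    server_label="zava_sales",
    server_url="https://<your-devtunnel>.devtunnels.ms/mcp",
)

agent = project.agents.create_agent(
    model="gpt-4o",  # placeholder deployment name
    name="zava-sales-agent",
    instructions="You are a sales analysis assistant for Zava.",
    tools=sales_mcp.definitions,
)
print(agent.id)
```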

docs/docs/en/architecture.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 ## Core technologies at a glance
 
-- **Azure AI Foundry Agent Service**
+- **Microsoft Foundry Agent Service**
 Hosts the LLM-driven agent; orchestrates tools (including MCP Servers); manages context, Code Interpreter, and token streaming; and provides authentication, logging, and scaling.
 - **MCP Servers**
 MCP (Model Context Protocol) is an open standard that gives LLMs a unified interface to external tools, APIs, and data. It standardizes tool discovery (like OpenAPI for REST) and improves composability by making tools easy to update or swap as needs evolve.
@@ -13,7 +13,7 @@
 
 The Zava Sales Analysis solution architecture includes:
 
-- An **Azure AI Foundry Agent Service** instance that hosts the Zava Sales Analysis agent.
+- A **Microsoft Foundry Agent Service** instance that hosts the Zava Sales Analysis agent.
 - A **PostgreSQL** database with the **pgvector** extension, storing Zava sales data and embeddings.
 - An **MCP Server** that exposes the PostgreSQL database to the agent via MCP.
 - An **Agent Manager** app that manages the interaction between the user and the agent.

docs/docs/en/hw-1-start-mcp-server.md

Lines changed: 3 additions & 3 deletions
@@ -7,7 +7,7 @@ In this lab, you will:
 
 ## Introduction
 
-The Model Context Protocol (MCP) server is a crucial component that handles the communication between Large Language Models (LLMs) and external tools and data sources. You’ll run the MCP server on your local machine, but the Azure AI Foundry Agent Service requires internet access to connect to it. To make your local MCP server accessible from the internet, you’ll use a DevTunnel. This allows the Agent Service to communicate with your MCP server as if it were running as a service in Azure.
+The Model Context Protocol (MCP) server is a crucial component that handles the communication between Large Language Models (LLMs) and external tools and data sources. You’ll run the MCP server on your local machine, but the Microsoft Foundry Agent Service requires internet access to connect to it. To make your local MCP server accessible from the internet, you’ll use a DevTunnel. This allows the Agent Service to communicate with your MCP server as if it were running as a service in Azure.
 
 ## Interface options for MCP
 
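Before the DevTunnel steps, it helps to see what "the MCP server" is in code. The sketch below is illustrative only and is not the workshop's actual server: it uses the `mcp` Python SDK's FastMCP helper with the Streamable HTTP transport discussed in the next hunk; the server name and tool are stand-ins.

```python
# Illustrative sketch of a Streamable HTTP MCP server, not the workshop's
# actual server. Uses FastMCP from the `mcp` Python SDK; the server name
# and tool below are stand-ins.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("zava-sales")  # hypothetical server name

@mcp.tool()
def list_stores() -> list[str]:
    """Return the Zava store names the agent may query."""
    return ["Seattle", "Redmond", "Online"]  # stub data for illustration

if __name__ == "__main__":
    # Streamable HTTP, so the Foundry Agent Service can call the server over
    # the internet once DevTunnel exposes the local port.
    mcp.run(transport="streamable-http")
```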
@@ -16,14 +16,14 @@ MCP supports two main interfaces for connecting LLMs with tools:
 - **Streamable HTTP Transport**: For web-based APIs and services.
 - **Stdio Transport**: For local scripts and command-line tools.
 
-This lab uses the Streamable HTTP transport interface to integrate with the Azure AI Foundry Agent Service.
+This lab uses the Streamable HTTP transport interface to integrate with the Microsoft Foundry Agent Service.
 
 !!! note
     Normally, you'd deploy the MCP server in a production environment, but for this workshop, you'll run it locally in your development environment. This allows you to test and interact with the MCP tools without needing a full deployment.
 
 ### Start up a DevTunnel for the MCP Server
 
-1. In a new terminal, authenticate DevTunnel. You'll be prompted to log in with your Azure account, use the same account you used to log in to the Azure AI Foundry Agent Service or Azure Portal. Run the following command:
+1. In a new terminal, authenticate DevTunnel. You'll be prompted to log in with your Azure account, use the same account you used to log in to the Microsoft Foundry Agent Service or Azure Portal. Run the following command:
 
     ```bash
     devtunnel login

docs/docs/en/index.md

Lines changed: 6 additions & 6 deletions
@@ -12,14 +12,14 @@ You need to analyze sales data to find trends, understand customer preferences,
 
 Learn to build an AI agent that analyzes sales data, answers product questions, and helps customers find products. Key topics:
 
-1. **Azure AI Foundry Agent Service**: Build and deploy AI agents with integrated tools and observability.
+1. **Microsoft Foundry Agent Service**: Build and deploy AI agents with integrated tools and observability.
 2. **Model Context Protocol (MCP)**: Connects the Agent Service to external tools and data over industry standard protocols to enhance agent functionality.
 3. **PostgreSQL**: Use PostgreSQL as a vector database for semantic search and implement Row Level Security (RLS) to protect sensitive data based on user roles.
-4. **Azure AI Foundry**: An enterprise-grade AI development platform providing unified model access, comprehensive monitoring, distributed tracing capabilities, and production-ready governance for AI applications at scale.
+4. **Microsoft Foundry**: An enterprise-grade AI development platform providing unified model access, comprehensive monitoring, distributed tracing capabilities, and production-ready governance for AI applications at scale.
 
 ### Just starting your AI Agents journey?
 
-New to AI agents? Start with [Build your code-first agent with Azure AI Foundry](https://aka.ms/aitour/WRK552){:target="_blank"}. You'll build a code-first agent integrating LLMs with databases, documents, and Bing Search—a solid foundation for advanced agents like Zava.
+New to AI agents? Start with [Build your code-first agent with Microsoft Foundry](https://aka.ms/aitour/WRK552){:target="_blank"}. You'll build a code-first agent integrating LLMs with databases, documents, and Bing Search—a solid foundation for advanced agents like Zava.
 
 ## What is an LLM-Powered AI Agent?
 
@@ -29,8 +29,8 @@ Example: If a user asks, "Show total sales per store as a pie chart", the agent
 
 This shifts much of the application logic from developers to the model. Clear instructions and dependable tools are essential for predictable agent behavior and results.
 
-## Introduction to the Azure AI Foundry
+## Introduction to the Microsoft Foundry
 
-[Azure AI Foundry](https://azure.microsoft.com/products/ai-foundry/){:target="_blank"} is Microsoft’s secure, flexible platform for designing, customizing, and managing AI apps and agents. Everything—models, agents, tools, and observability—lives behind a single portal, SDK, and REST endpoint, so you can ship to cloud or edge with governance and cost controls in place from day one.
+[Microsoft Foundry](https://azure.microsoft.com/products/ai-foundry/){:target="_blank"} is Microsoft’s secure, flexible platform for designing, customizing, and managing AI apps and agents. Everything—models, agents, tools, and observability—lives behind a single portal, SDK, and REST endpoint, so you can ship to cloud or edge with governance and cost controls in place from day one.
 
-![Azure AI Foundry Architecture](media/azure-ai-foundry.png)
+![Microsoft Foundry Architecture](media/azure-ai-foundry.png)

docs/docs/en/lab-2-start-the-agent.md

Lines changed: 2 additions & 2 deletions
@@ -177,13 +177,13 @@ From the web chat client, you can start a conversation with the agent. The agent
 !!! tip
     === "Python"
 
-        Switch back to VS Code and select **MCP Server (workspace)** from the TERMINAL panel and you'll see the calls made to the MCP Server by the Azure AI Foundry Agent Service.
+        Switch back to VS Code and select **MCP Server (workspace)** from the TERMINAL panel and you'll see the calls made to the MCP Server by the Microsoft Foundry Agent Service.
 
         ![](../media/mcp-server-in-action.png)
 
     === "C#"
 
-        In the Aspire dashboard, you can select the logs for the `dotnet-mcp-server` resource to see the calls made to the MCP Server by the Azure AI Foundry Agent Service.
+        In the Aspire dashboard, you can select the logs for the `dotnet-mcp-server` resource to see the calls made to the MCP Server by the Microsoft Foundry Agent Service.
 
         You can also open the trace view and find the end-to-end trace of the application, from the user input in the web chat, through to the agent calls and MCP tool calls.
 
docs/docs/en/lab-5-monitoring.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ## Introduction
 
-Monitoring keeps your Azure AI Foundry Agent Service available, performant, and reliable. Azure Monitor collects metrics and logs, provides real‑time insights, and sends alerts. Use dashboards and custom alerts to track key metrics, analyze trends, and respond proactively. Access monitoring via the Azure portal, CLI, REST API, or client libraries.
+Monitoring keeps your Microsoft Foundry Agent Service available, performant, and reliable. Azure Monitor collects metrics and logs, provides real‑time insights, and sends alerts. Use dashboards and custom alerts to track key metrics, analyze trends, and respond proactively. Access monitoring via the Azure portal, CLI, REST API, or client libraries.
 
 ## Lab Exercise
 
 1. From the VS Code file explorer, open the `resources.txt` file in the `workshop` folder.
 1. **Copy** the value for the `AI Project Name` key to the clipboard.
-1. Navigate to the [Azure AI Foundry Portal](https://ai.azure.com){:target="_blank"} page.
+1. Navigate to the [Microsoft Foundry Portal](https://ai.azure.com){:target="_blank"} page.
 1. Select your project from the list of foundry projects.
 
 ## Open the Monitoring dashboard

docs/docs/en/lab-6-tracing.md

Lines changed: 5 additions & 5 deletions
@@ -1,6 +1,6 @@
 ## Introduction
 
-Tracing helps you understand and debug your agent's behavior by showing the sequence of steps, inputs, and outputs during execution. In Azure AI Foundry, tracing lets you observe how your agent processes requests, calls tools, and generates responses. You can use the Azure AI Foundry portal or integrate with OpenTelemetry and Application Insights to collect and analyze trace data, making it easier to troubleshoot and optimize your agent.
+Tracing helps you understand and debug your agent's behavior by showing the sequence of steps, inputs, and outputs during execution. In Microsoft Foundry, tracing lets you observe how your agent processes requests, calls tools, and generates responses. You can use the Microsoft Foundry portal or integrate with OpenTelemetry and Application Insights to collect and analyze trace data, making it easier to troubleshoot and optimize your agent.
 
 <!-- ## Lab Exercise
 
@@ -40,13 +40,13 @@ Write an executive report that analysis the top 5 product categories and compare
 
 ## View Traces
 
-You can view the traces of your agent's execution in the Azure AI Foundry portal or by using OpenTelemetry. The traces will show the sequence of steps, tool calls, and data exchanged during the agent's execution. This information is crucial for debugging and optimizing your agent's performance.
+You can view the traces of your agent's execution in the Microsoft Foundry portal or by using OpenTelemetry. The traces will show the sequence of steps, tool calls, and data exchanged during the agent's execution. This information is crucial for debugging and optimizing your agent's performance.
 
-### Using Azure AI Foundry Portal
+### Using Microsoft Foundry Portal
 
-To view traces in the Azure AI Foundry portal, follow these steps:
+To view traces in the Microsoft Foundry portal, follow these steps:
 
-1. Navigate to the [Azure AI Foundry](https://ai.azure.com/) portal.
+1. Navigate to the [Microsoft Foundry](https://ai.azure.com/) portal.
 2. Select your project.
 3. Select the **Tracing** tab in the left-hand menu.
 4. Here, you can see the traces generated by your agent.
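Alongside the portal steps, the OpenTelemetry route mentioned in the introduction can be sketched roughly as follows (assuming the `azure-monitor-opentelemetry` distro; the connection-string variable, span name, and attribute are placeholders, not values from this repo):

```python
# Rough sketch: send OpenTelemetry spans to Application Insights so custom
# spans appear next to the agent traces. Assumes azure-monitor-opentelemetry;
# the environment variable, span name, and attribute are placeholders.
import os

from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import trace

configure_azure_monitor(
    connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"],
)

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("zava-agent-request") as span:
    span.set_attribute("zava.user_role", "store_manager")  # illustrative attribute
    # ... run the agent / MCP tool calls inside this span ...
```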
