Embedding a live AI browser agent in your React app with Amazon Bedrock

Users of AI-powered applications need to understand and trust the agents that act on their behalf. When an agent autonomously interacts with web content, users need visibility into those actions to maintain confidence and control. The Amazon Bedrock AgentCore Browser BrowserLiveView component addresses this challenge by providing a real-time video feed of the agent's browsing session directly within your React application.

This component, part of the Bedrock AgentCore TypeScript SDK, streamlines integration by embedding a live browser stream with just a few lines of JSX (JavaScript XML). The BrowserLiveView component uses the Amazon DCV protocol to render the browser session, creating transparency into agent actions. Implementation requires only a presigned URL from your server, without the need to build streaming infrastructure.
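To make the client-side flow concrete, here is a minimal sketch of how an application might fetch the presigned URL from its backend and validate its shape before handing it to the component. The `/api/live-view-url` endpoint, the `presignedUrl` prop, and the helper below are illustrative assumptions, not the SDK's documented API; consult the Bedrock AgentCore TypeScript SDK reference for the actual prop names.

```typescript
// Minimal shape check for the URL the server returns: the Live View stream
// is a WebSocket endpoint carrying SigV4 query parameters, so at a minimum
// we expect a wss:// scheme and an X-Amz-Signature parameter.
export function isPresignedLiveViewUrl(url: string): boolean {
  try {
    const u = new URL(url);
    return u.protocol === "wss:" && u.searchParams.has("X-Amz-Signature");
  } catch {
    return false; // not a parseable URL at all
  }
}

// Sketch of the React usage (component and prop names are assumptions):
//
//   const [url, setUrl] = useState<string | null>(null);
//   useEffect(() => {
//     fetch("/api/live-view-url")          // your server endpoint (hypothetical)
//       .then((r) => r.json())
//       .then((d) => setUrl(d.url));
//   }, []);
//   return url && isPresignedLiveViewUrl(url)
//     ? <BrowserLiveView presignedUrl={url} />
//     : <p>Starting browser session…</p>;
```

Validating the URL client-side is optional, but it surfaces misconfigured backends (for example, returning an unsigned `https://` URL) as an explicit error rather than a silently blank video pane.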

This post walks you through three steps: starting a session and generating the Live View URL, rendering the stream in your React application, and wiring up an AI agent that drives the browser while users watch. By the end, you will have a working sample application that you can clone and run.

Embedding Live View inside your application unlocks additional value for your users at scale. With an embedded Live View, users can follow every navigation, form submission, and search query as the agent performs them. They receive immediate visual confirmation that the agent is on the right page, interacting with the correct elements, and progressing through the workflow. This real-time feedback loop gives end users direct insight into agent behavior without waiting for the final result.

Users who delegate browsing tasks to an AI agent are more confident when they can observe the work. Watching the agent fill in a form field by field is more reassuring than receiving a text confirmation. For regulated workflows, visual evidence of agent actions can support audit requirements. In workflows that require human supervision, such as handling customer accounts and processing sensitive data, a supervisor can use the embedded Live View to watch the agent in real time and intervene if needed, without leaving your application.

Organizations also gain audit trail support through visual evidence of agent actions, which proves valuable for compliance requirements and troubleshooting scenarios. Combined with session recordings to Amazon Simple Storage Service (Amazon S3) and console-based session replay, you get both real-time observation and post-hoc review.

The integration consists of three components. The user's web browser runs a React application containing the BrowserLiveView component, which receives a SigV4-presigned URL and establishes a persistent WebSocket connection to receive the DCV video stream from a remote browser session. The React application handles video rendering and user interface presentation while maintaining the WebSocket connection for continuous streaming. The application server hosts the AI agent and manages the Amazon Bedrock session lifecycle, orchestrating the connection between client browsers and cloud-hosted browser sessions.
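The server side of this architecture can be sketched as a single endpoint that starts (or reuses) a browser session and returns its presigned Live View URL. The route path, the session ID, and `presignLiveViewUrl` below are stand-ins so the flow is runnable; in a real application that function would call the Bedrock AgentCore SDK to create the session and SigV4-presign its DCV WebSocket URL.

```typescript
import { createServer } from "node:http";

// Stand-in for the real presigning step (hypothetical; the actual SDK call
// and URL format will differ). The key property it models: the browser only
// ever receives a short-lived presigned URL, never your AWS credentials.
export async function presignLiveViewUrl(sessionId: string): Promise<string> {
  return `wss://example.amazonaws.com/sessions/${sessionId}/live-view?X-Amz-Signature=stub`;
}

// Minimal HTTP server exposing the URL to the React client. A production
// server would also authenticate the caller and scope the session to them.
export const server = createServer(async (req, res) => {
  if (req.url === "/api/live-view-url") {
    const url = await presignLiveViewUrl("session-123"); // placeholder session ID
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ url }));
  } else {
    res.statusCode = 404;
    res.end();
  }
});
```

Keeping presigning on the server is the design point worth noting: the client's WebSocket connection is authorized entirely by the signed query parameters, so the React application needs no AWS SDK or credentials of its own.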
