TL;DR: Discover how to embed AI-powered smart location search into the Syncfusion WPF Maps control using OpenAI’s advanced technologies. This blog covers setting up Azure OpenAI, configuring the WPF Maps control, and displaying custom markers based on user input.
Syncfusion WPF Maps control is a powerful data visualization component that displays statistical information for a geographical area. Its highly interactive and customizable features include zooming, panning, selection, markers, legends, bubbles, and color mapping. Use this control to render sales, political boundary, weather, electoral, and route maps.
In this blog, we’ll explore how to integrate AI-powered smart location search capabilities in the WPF Maps control using OpenAI’s advanced technologies. This helps us quickly locate and identify specific places on the map more precisely.
Let’s begin!
Start by opening Visual Studio and creating a new WPF project.
Before diving into AI-powered location search, ensure you have access to Azure OpenAI and have set up a deployment in the Azure portal. Add the Azure.AI.OpenAI NuGet package from the NuGet Gallery.
Once you’ve obtained your API key and endpoint, follow these steps:
We’ll use the GPT-4O model for text-based queries and DALL-E for image generation. First, set up the OpenAIClient as shown in the following code example.
internal class AzureOpenAIService
{
    const string endpoint = "https://{YOUR_END_POINT}.openai.azure.com";
    const string deploymentName = "GPT-4O";
    const string imageDeploymentName = "DALL-E";
    string key = "API key";

    OpenAIClient? azureClient;
    ChatCompletionsOptions? chatCompletions;

    internal AzureOpenAIService()
    {
    }
}
Now, connect to Azure OpenAI. Refer to the following code.
// Create the client with your endpoint and API key.
this.azureClient = new OpenAIClient(new Uri(endpoint), new AzureKeyCredential(key));
This enables you to communicate with the model, submit prompts, and receive responses, which can then be used to create map markers in WPF Maps.
Next, implement the GetResponseFromOpenAI and GetImageFromAI methods to fetch responses from the Azure OpenAI API based on the user input.
/// <summary>
/// Retrieves an answer from the deployed model using the provided user prompt.
/// </summary>
/// <param name="userPrompt">The user prompt.</param>
/// <returns>The AI response.</returns>
internal async Task<string> GetResponseFromOpenAI(string userPrompt)
{
    this.chatCompletions = new ChatCompletionsOptions
    {
        DeploymentName = deploymentName,
        Temperature = (float)0.5,
        MaxTokens = 800,
        NucleusSamplingFactor = (float)0.95,
        FrequencyPenalty = 0,
        PresencePenalty = 0,
    };

    if (this.AzureClient != null)
    {
        // Set the assistant's role with a system message.
        this.chatCompletions?.Messages.Add(new ChatRequestSystemMessage("You are a predictive analytics assistant."));

        // Add the user's prompt as a user message to the conversation.
        this.chatCompletions?.Messages.Add(new ChatRequestUserMessage(userPrompt));

        try
        {
            // Send the chat completion request to the OpenAI API and await the response.
            var response = await this.AzureClient.GetChatCompletionsAsync(this.chatCompletions);

            // Return the content of the first choice in the response, which contains the AI's answer.
            return response.Value.Choices[0].Message.Content;
        }
        catch
        {
            // If an exception occurs (e.g., network issues, API errors), return an empty string.
            return "";
        }
    }

    return "";
}

/// <summary>
/// Method to get the image from AI.
/// </summary>
/// <param name="locationName">The location name.</param>
/// <returns>The bitmap image.</returns>
public async Task<BitmapImage> GetImageFromAI(string locationName)
{
    var imageGenerations = await azureClient.GetImageGenerationsAsync(
        new ImageGenerationOptions()
        {
            Prompt = $"Share the {locationName} image.",
            Size = ImageSize.Size1024x1024,
            Quality = ImageGenerationQuality.Standard,
            DeploymentName = imageDeploymentName,
        });

    // Load the generated image from its URL into a BitmapImage.
    Uri imageUri = imageGenerations.Value.Data[0].Url;
    var bitmapImage = new BitmapImage();
    bitmapImage.BeginInit();
    bitmapImage.UriSource = imageUri;
    bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
    bitmapImage.EndInit();
    return bitmapImage;
}
With this setup, your AzureOpenAIService can communicate with the OpenAI API and fetch results based on user inputs.
Let’s design a location search feature using the WPF AutoComplete control and then map the selected location using the WPF Maps control. Before starting, ensure you have referred to the getting started documentation for both the Syncfusion WPF Maps and AutoComplete controls.
Create a model to represent geographic information that will be displayed as markers on the WPF Maps.
/// <summary>
/// Represents information about a geographic location, including its name, details, coordinates, and address.
/// </summary>
public class LocationInfo
{
    /// <summary>
    /// Gets or sets the name of the location.
    /// </summary>
    public string Name { get; set; }

    /// <summary>
    /// Gets or sets additional details about the location.
    /// </summary>
    public string Details { get; set; }

    /// <summary>
    /// Gets or sets the longitude coordinate of the location.
    /// </summary>
    public string Longitude { get; set; }

    /// <summary>
    /// Gets or sets the latitude coordinate of the location.
    /// </summary>
    public string Latitude { get; set; }

    /// <summary>
    /// Gets or sets the address of the location.
    /// </summary>
    public string Address { get; set; }

    /// <summary>
    /// Gets or sets the image source.
    /// </summary>
    public ImageSource ImageSource { get; set; }

    /// <summary>
    /// Gets or sets a value indicating whether the AI service is available online.
    /// </summary>
    public bool IsOnline { get; set; }
}
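The markers themselves are typically held in an observable collection so that the map picks up changes as items are added or cleared. The field name `viewModel` below matches the search code later in this blog, but its exact declaration is an assumption, shown here as a minimal sketch:

```csharp
using System.Collections.ObjectModel;

// Hypothetical declaration: the collection that is later assigned to
// ImageryLayer.Markers. ObservableCollection notifies the map when
// markers are added or removed, so the layer stays in sync.
private readonly ObservableCollection<LocationInfo> viewModel =
    new ObservableCollection<LocationInfo>();
```

Any `IEnumerable` works for the Markers property, but an `ObservableCollection` avoids reassigning the whole collection after each search.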
Now, incorporate a maps imagery layer into your WPF Maps to search for and locate landmarks based on user input.
xmlns:syncfusion="http://schemas.syncfusion.com/wpf"

<syncfusion:SfMap x:Name="map" EnableZoom="True" MaxZoom="20" MinZoom="4" ZoomLevel="4">
    <syncfusion:SfMap.Layers>
        <syncfusion:ImageryLayer x:Name="imageLayer"
                                 LayerType="OSM"
                                 MarkerTemplate="{StaticResource markerTemplate}"
                                 Center="37.0902, -95.7129">
            <syncfusion:ImageryLayer.MarkerToolTipSettings>
                <syncfusion:ToolTipSetting ShowDuration="3000"
                                           StrokeThickness="0"
                                           ToolTipTemplate="{StaticResource DynamicToolTipTemplate}"
                                           Background="WhiteSmoke"
                                           Margin="0"/>
            </syncfusion:ImageryLayer.MarkerToolTipSettings>
        </syncfusion:ImageryLayer>
    </syncfusion:SfMap.Layers>
</syncfusion:SfMap>
You can customize the WPF Maps markers and tooltips to display relevant information to users.
<democommon:DemoControl.Resources>
    <local:StringToImageConverter x:Key="stringToImageConverter" />

    <DataTemplate x:Key="markerTemplate">
        <Grid>
            <Canvas Margin="-12,-30,0,0">
                <Image Height="25" Source="Assets\Map\Images\pin.png" />
            </Canvas>
        </Grid>
    </DataTemplate>

    <DataTemplate x:Key="DynamicToolTipTemplate">
        <Border Background="{DynamicResource PopupBackground}"
                Height="auto"
                Width="250"
                Effect="{DynamicResource Default.ShadowDepth3}"
                CornerRadius="10">
            <StackPanel Background="Transparent">
                <Image x:Name="imageDynamic" Height="120" Stretch="Fill" />
                <TextBlock Text="{Binding Data.Name}" FontSize="12" FontWeight="Bold" TextWrapping="Wrap" Padding="10,0,0,0" />
                <TextBlock Text="{Binding Data.Details}" FontSize="10" FontWeight="SemiBold" TextWrapping="Wrap" Padding="10,0,0,0" />
                <TextBlock Text="{Binding Data.Address}" FontSize="10" FontWeight="SemiBold" TextWrapping="Wrap" Padding="10,0,0,5" />
            </StackPanel>
        </Border>
        <DataTemplate.Triggers>
            <DataTrigger Binding="{Binding Data.IsOnline}" Value="True">
                <Setter TargetName="imageDynamic" Property="Source" Value="{Binding Data.ImageSource}" />
            </DataTrigger>
            <DataTrigger Binding="{Binding Data.IsOnline}" Value="False">
                <Setter TargetName="imageDynamic" Property="Source" Value="{Binding Data.Name, Converter={StaticResource stringToImageConverter}}" />
            </DataTrigger>
        </DataTemplate.Triggers>
    </DataTemplate>
</democommon:DemoControl.Resources>
Now, integrate the WPF AutoComplete control to allow users to enter search terms that the AI service will process.
xmlns:syncfusion="http://schemas.syncfusion.com/wpf"

<StackPanel Orientation="Horizontal" VerticalAlignment="Top" Margin="10">
    <syncfusion:SfTextBoxExt x:Name="autoCompleteTextBox"
                             Text="Hospitals in New York"
                             HorizontalAlignment="Center"
                             VerticalAlignment="Center"
                             Width="320"
                             Height="40"
                             AutoCompleteMode="Suggest" />
    <Button x:Name="searchButton"
            Background="Transparent"
            BorderBrush="Transparent"
            Width="40"
            Height="40"
            Margin="-40,0,0,0">
        <Path x:Name="Search"
              Width="12"
              Height="12"
              Margin="8,9,8,9"
              HorizontalAlignment="Right"
              VerticalAlignment="Center"
              StrokeEndLineCap="Round"
              Data="M4.55051 6.5C5.18178 7.11859 6.04635 7.5 7 7.5C8.933 7.5 10.5 5.933 10.5 4C10.5 2.067 8.933 0.5 7 0.5C5.067 0.5 3.5 2.067 3.5 4C3.5 4.97934 3.90223 5.86474 4.55051 6.5ZM4.55051 6.5L1 10"
              Stretch="Fill"
              Stroke="{Binding ElementName=searchButton, Path=Foreground}" />
    </Button>
</StackPanel>
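To connect this UI to the AI service, the search button's Click event needs to invoke the recommendation logic. The hookup below is a minimal sketch in code-behind; the `GetRecommendationAsync` method and `azureAIService` field come from elsewhere in this blog, but the constructor wiring itself is an assumption:

```csharp
public partial class MainWindow : Window
{
    // Service wrapper for Azure OpenAI, defined earlier in this blog.
    private readonly AzureOpenAIService azureAIService = new AzureOpenAIService();

    public MainWindow()
    {
        InitializeComponent();

        // Hypothetical wiring: run the AI-powered search whenever the
        // user clicks the search button, passing the current query text.
        this.searchButton.Click += async (sender, e) =>
            await this.GetRecommendationAsync(this.autoCompleteTextBox.Text);
    }
}
```

An async lambda keeps the UI responsive while the API call and image downloads complete; any exceptions should be handled inside `GetRecommendationAsync`, since they cannot propagate out of an async void event handler.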
Add a prompt to request the AI service to convert user input into geographic locations in JSON format. The JSON data is then parsed into custom markers, which are added to WPF Maps using the Markers property of the ImageryLayer class.
Refer to the following code example.
/// <summary>
/// Gets the AI response and updates the map markers.
/// </summary>
/// <param name="userQuery">The user query.</param>
/// <returns>The task details.</returns>
private async Task GetRecommendationAsync(string userQuery)
{
    if (this.autoCompleteTextBox == null || this.imageLayer == null || this.map == null)
    {
        return;
    }

    if (string.IsNullOrWhiteSpace(this.autoCompleteTextBox.Text))
    {
        return;
    }

    if (this.busyIndicator != null)
    {
        this.busyIndicator.Visibility = Visibility.Visible;
        this.busyIndicator.IsBusy = true;
    }

    var returnMessage = string.Empty;
    if (azureAIService.AzureClient != null)
    {
        // Ask the model to return the locations as JSON so they can be parsed into markers.
        string prompt = $"Given location name: {userQuery}" +
            $"\nSome conditions need to be followed:" +
            $"\nIf the location name is just a state, city, capital, or region, then retrieve the following fields: location name, detail, latitude, longitude, and set the address value as null." +
            $"\nOtherwise, retrieve a minimum of 5 to 6 entries with the following fields: location's name, details, latitude, longitude, address." +
            $"\nThe return format should be the following JSON format: markercollections[Name, Details, Latitude, Longitude, Address]" +
            $"\nRemove ```json and remove ``` if it is there in the code." +
            $"\nProvide JSON format details only. No explanation is needed.";

        returnMessage = await azureAIService.GetResponseFromOpenAI(prompt);
    }

    var jsonObj = new JObject();
    if (returnMessage == string.Empty)
    {
        // Fall back to offline sample data when the AI service is unavailable.
        if (this.autoCompleteTextBox.Text == "Hospitals in New York")
        {
            jsonObj = JObject.Parse(this.dataHelper.hospitalsData);
        }
        else if (this.autoCompleteTextBox.Text == "Hotels in Denver")
        {
            jsonObj = JObject.Parse(dataHelper.hotelsData);
        }
        else
        {
            return;
        }
    }
    else
    {
        jsonObj = JObject.Parse(returnMessage);
    }

    // Convert each JSON entry into a LocationInfo marker.
    this.viewModel.Clear();
    foreach (var marker in jsonObj["markercollections"])
    {
        LocationInfo model = new LocationInfo
        {
            Name = (string)marker["Name"],
            Details = (string)marker["Details"],
            Latitude = (string)marker["Latitude"],
            Longitude = (string)marker["Longitude"],
            Address = (string)marker["Address"]
        };

        if (azureAIService.AzureClient != null)
        {
            model.IsOnline = true;
            model.ImageSource = await azureAIService.GetImageFromAI(model.Name);
        }
        else
        {
            model.IsOnline = false;
        }

        this.viewModel.Add(model);
    }

    this.imageLayer.Markers = this.viewModel;

    // Center the map on the first marker, if any.
    if (this.viewModel.Count > 0)
    {
        var firstMarker = this.viewModel[0];
        if (double.TryParse(firstMarker.Latitude, out double latitude) &&
            double.TryParse(firstMarker.Longitude, out double longitude))
        {
            this.imageLayer.Center = new Point(latitude, longitude);
        }
    }

    this.map.ZoomLevel = 10;
}
Refer to the following output image.
For more details, refer to the AI-powered smart location search in the WPF Maps GitHub demo.
Thanks for reading! In this blog, we’ve seen how to integrate an AI-powered smart location search feature into the Syncfusion WPF Maps control using Azure OpenAI. This helps us quickly locate and identify specific places on the map with greater precision. This feature is available in our latest 2024 Volume 3 release. Give it a try, and leave your feedback in the comments section below!
Existing customers can download the new version of Essential Studio® on the License and Downloads page. If you are not a customer, try our 30-day free trial to check out these new features.
If you have questions, contact us through our support forums, feedback portal, or support portal. We are always happy to assist you!