NVIDIA just released a free demo of a chatbot, called Chat with RTX, that runs locally on your PC. This is pretty neat, as it gives the chatbot access to your files and documents. You can feed Chat with RTX a selection of personal data and have it create summaries based on that information. You can also ask it questions, just like any chatbot, and it’ll dive into your data for answers.

The company says it allows users to “quickly, easily connect local files on a PC as a dataset to an open-source large language model like Mistral or Llama 2.” NVIDIA gives an example of a user asking the chatbot about a restaurant their partner recommended while in Las Vegas. The software scans local files to find the answer. It supports a whole bunch of file formats, including .txt, .pdf, .doc/.docx and .xml. The company says it’ll load relevant files into its dataset “in seconds.”
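Under the hood this is the familiar retrieval-augmented generation pattern: index the local files, pull out the passages most relevant to a question, and hand them to the model as context. The sketch below illustrates that flow in plain Python; the folder path, the crude word-overlap ranking, and the final hand-off to a locally hosted Mistral or Llama 2 model are simplifications for illustration, not NVIDIA's actual implementation.

```python
# Minimal sketch of the "local files as a dataset" idea: index text files in a
# folder, pull the passages most relevant to a question, and build a prompt for
# a local LLM. The folder path and the send-to-model step are placeholders.
from pathlib import Path


def load_passages(folder: str) -> list[str]:
    """Split every .txt file in the folder into paragraph-sized passages."""
    passages = []
    for path in Path(folder).rglob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        passages += [p.strip() for p in text.split("\n\n") if p.strip()]
    return passages


def top_passages(question: str, passages: list[str], k: int = 3) -> list[str]:
    """Rank passages by word overlap with the question (a stand-in for a real embedding search)."""
    q_words = set(question.lower().split())
    ranked = sorted(passages,
                    key=lambda p: len(q_words & set(p.lower().split())),
                    reverse=True)
    return ranked[:k]


def build_prompt(question: str, folder: str) -> str:
    """Assemble the retrieved context and the question into a single prompt."""
    context = "\n---\n".join(top_passages(question, load_passages(folder)))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    # The resulting prompt would then go to a locally hosted model such as Mistral or Llama 2.
    print(build_prompt("What restaurant did my partner recommend in Las Vegas?", "./my_documents"))
```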

Chat with RTX also integrates YouTube videos and playlists. You can add a video URL into the dataset and it’ll integrate the knowledge contained in the clip for contextual queries. NVIDIA says this will be useful when asking for travel recommendations “based on content from favorite influencer videos” or when looking for tutorials and summaries derived from educational resources.
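NVIDIA hasn't detailed how the demo ingests video, but a reasonable sketch is to fetch the clip's transcript and drop it into the same document folder so the retrieval step above can search it. This example leans on the third-party youtube-transcript-api package and a hypothetical ./my_documents folder, both assumptions rather than anything Chat with RTX documents.

```python
# Sketch of the YouTube ingestion step: grab a video's transcript and save it as a
# plain-text file alongside the other local documents, so the same retrieval code
# can search it. Assumes the third-party youtube-transcript-api package
# (pip install youtube-transcript-api); NVIDIA hasn't said how it pulls video content.
from pathlib import Path
from urllib.parse import parse_qs, urlparse

from youtube_transcript_api import YouTubeTranscriptApi


def video_id_from_url(url: str) -> str:
    """Extract the ?v= video ID from a standard YouTube watch URL."""
    return parse_qs(urlparse(url).query)["v"][0]


def save_transcript(url: str, out_dir: str = "./my_documents") -> Path:
    """Fetch the transcript and write it next to the other dataset files."""
    video_id = video_id_from_url(url)
    lines = YouTubeTranscriptApi.get_transcript(video_id)  # list of {"text", "start", "duration"}
    text = "\n\n".join(chunk["text"] for chunk in lines)
    out_path = Path(out_dir) / f"youtube_{video_id}.txt"
    out_path.parent.mkdir(parents=True, exist_ok=True)
    out_path.write_text(text, encoding="utf-8")
    return out_path


if __name__ == "__main__":
    # Placeholder URL; the saved transcript then becomes searchable context for queries.
    print(save_transcript("https://www.youtube.com/watch?v=VIDEO_ID"))
```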

The Verge went hands-on with the chatbot and came away impressed, writing that they could see it as “a valuable part of data research for journalists or anyone who needs to analyze a collection of documents.”

This sounds like a big step toward something resembling an actual digital assistant that works within the contextual framework of your personal data. With most chatbots, the data is sent off to the cloud, but Chat with RTX “lets users process sensitive data on a local PC without the need to share it with a third party or have an internet connection.” So it’s safer and more contextually aware.

There are some limitations. This is a demo product, so expect plenty of bugs, though NVIDIA should start squashing them as users file bug reports. There are also some strict hardware requirements: Chat with RTX only works on Windows PCs with an NVIDIA GeForce RTX 30 Series GPU or higher and at least 8GB of VRAM.
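If you're not sure whether your machine clears that bar, a quick check from Python using PyTorch's CUDA introspection looks like the sketch below; the installer's own checks are presumably more thorough, and the 8GB threshold here just mirrors the stated requirement.

```python
# Quick sanity check against the stated requirements: an NVIDIA RTX GPU with at
# least 8GB of VRAM. Uses PyTorch's CUDA introspection (pip install torch).
import torch


def meets_requirements(min_vram_gb: float = 8.0) -> bool:
    """Report the first GPU's name and VRAM and compare against the minimum."""
    if not torch.cuda.is_available():
        print("No CUDA-capable GPU detected.")
        return False
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / 1024 ** 3
    print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")
    return vram_gb >= min_vram_gb


if __name__ == "__main__":
    print("Meets Chat with RTX requirements:", meets_requirements())
```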

NVIDIA has really been showing off its AI prowess lately alongside its latest hardware launches, and the company’s recent growth is due primarily to its AI and data center segments.

