I grew up playing tabletop role-playing games like Dungeons & Dragons and have spent more time leafing through books looking up the rules than I'd like to admit.
This is old news to anyone familiar with the hobby, but having to stop everything while someone looked up a rule was one of the least fun parts of playing D&D as a kid. If you've ever watched a friend waste five minutes hunting for the page with Lich King Karl's spell list on it, you know what I'm talking about.
Part of the problem is that nobody actually wants to memorize the rules (we're all here to have fun, not to pass a test), but stopping to look one up pulls everyone out of the game and makes it less lively. Switching from physical manuals to digital .pdfs has helped somewhat, because a PDF reader like Adobe Acrobat lets you run text searches and find (most) things more quickly.
But .pdf files on laptops or tablets at the table aren't a perfect solution either, because players often have to scroll through pages hunting for something they don't know the name of or can't remember the details of. So when I saw that Adobe had added a new AI assistant to Acrobat that could answer questions about documents, I was intrigued: maybe an AI assistant could make it faster and easier to play, run, and design a D&D campaign?
It's a wonderful dream. Imagine being able to pull a digital copy of a thick reference book onto your laptop and ask Adobe's AI any question you want, confident that you'll get accurate, specific details instantly. This would be a huge time-saver not just for tabletop gamers, but for students, researchers, and anyone who deals with large documents on a daily basis at work.
But after trying out Adobe AI Assistant this week, I'm sad to say that its current state is far from the dream. While the AI tool works as advertised (usually) and can be extremely helpful when parsing or understanding .pdfs, it does have some limitations that keep it from being the ultimate D&D assistant.
Big potential, half-hearted execution
To help you see what I mean, I'll walk you through the good (and not so good) things I've found so far about using Adobe's AI tools. Don't take my word for it: you can try it for free on the Adobe AI Assistant website.
The free version does have some limitations, though. First, you'll need to create an Adobe account to ask Acrobat anything beyond quick demo questions about a given document. Once you sign up, you only get five free requests before you have to subscribe to Adobe's AI Assistant. During the company's current “early access pricing” period, that costs between $4.99 and $6.99 per month (Adobe is offering a student discount through September 4 that brings it down to $1.99 per month), and the price may rise in the future.
Right away, I ran into some limitations you should be aware of when using this tool. When I tested it, the AI assistant claimed it could only handle documents up to 200 pages or 65MB in size, which means you can't chat with the AI about core texts like the D&D Dungeon Master's Guide or Player's Handbook.
(I contacted Adobe PR about this and was told that it actually only supports documents up to 120 pages and 25MB in size, but that you can use Adobe Acrobat's PDF split feature to break large books into chunks the AI can process.)
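If you'd rather not do the splitting by hand in Acrobat, here's a rough sketch of the same idea using the open-source pypdf Python library. This is my own workaround, not an Adobe feature; the file names and the 120-page chunk size are just examples, and it only enforces the page limit, not the 25MB file-size cap.

from pypdf import PdfReader, PdfWriter

def split_pdf(path, pages_per_chunk=120):
    # Split one large PDF into numbered pieces small enough for the AI assistant's page limit.
    reader = PdfReader(path)
    total_pages = len(reader.pages)
    for chunk_index, start in enumerate(range(0, total_pages, pages_per_chunk), start=1):
        writer = PdfWriter()
        for page_number in range(start, min(start + pages_per_chunk, total_pages)):
            writer.add_page(reader.pages[page_number])
        out_name = f"{path.rsplit('.', 1)[0]}_part{chunk_index}.pdf"
        with open(out_name, "wb") as out_file:
            writer.write(out_file)

split_pdf("players_handbook.pdf")  # writes players_handbook_part1.pdf, players_handbook_part2.pdf, ...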
Still, there's real value in using the AI to parse shorter texts, like indie RPGs, short adventures, or expansions for larger games like D&D.
However, the free version of Adobe's AI assistant seems less capable than the paid version I got to test. While I was on the free tier, the chatbot couldn't reliably answer questions like “What page is Elemental Bane on?”, which was disappointing, since that's exactly the kind of basic lookup I was hoping it would handle.
Adobe's AI assistant's responses to questions like “How many wizard spells are there?” (Image credit: Future)
After upgrading to the paid version, the AI assistant did start telling me what page Elemental Bane was on, except it was wrong, repeatedly giving me page numbers that were one page off from the spell's actual location. To make matters worse, when I asked where the spell “Bones of the Earth” was, the assistant gave me two incorrect page numbers, one of which didn't even exist (page 31 of a 25-page document).
Even summaries can be hit or miss. For example, when I asked Adobe's AI to summarize the benefits of playing a genasi, it got roughly 75% of the way there but left out important details that less experienced players wouldn't know were missing. So, AI or not, you still need human intelligence to get the most out of these documents.
Still, I see a lot of promise here, and if Adobe can iron out the issues, this tool could genuinely change how we use tabletop RPG books (and plenty of other reference material).
For example, when I asked Adobe's AI assistant which 6th-level spells were available to wizards, it was spot on, returning a quick list of the six such spells in the .pdf I was referencing, with a source link next to each spell's name. Every link worked, taking me directly to the page with the spell and drawing a blue highlight box around the spell text in question.
Unfortunately, those blue highlight boxes were always a little off, cutting out half of the relevant text and highlighting half of the previous entry instead, but at least the AI assistant could quickly look up which spells a wizard could use and let me jump straight to a page with more details. That could be a huge timesaver for both players and Dungeon Masters.
Adobe AI Assistant: What's next
Like every AI assistant I've tested in the year-plus since Microsoft kicked off the craze by building ChatGPT tech into Bing, Adobe's chatbot has room for improvement.
My quick testing showed that Adobe's AI can handle basic summarization and searching within .pdfs, much like web chatbots such as Google Gemini, but it's prone to errors. It can be useful and save you time once you get the hang of it, but the number of mistakes in its answers means you need to know your subject matter well enough to fact-check the AI if you want good results.
But there's real promise here: if Adobe can get this service to the point where it's reliably accurate and can dependably answer basic questions about a given text, like “What page is this on?” or “How does this rule work?”, then the AI assistant could be a real game-changer for my D&D sessions.