DnDialogueGenerator

Deep Machine Learning · NLP · Fine-tuning

This project explores an automated tool to help Game Masters generate dynamic, context-aware NPC dialogue for Dungeons & Dragons campaigns. I fine-tuned a GPT-2 model on dialogue data annotated with categories and character personalities to produce dialogue that better matches specific character traits and improves narrative continuity.

What's Inside

Problem & Motivation

Why NPC dialogue generation is hard and what existing tools miss: continuity and character consistency are critical for immersive roleplay experiences.

Model & Training Approach

GPT-2, a decoder-only transformer, fine-tuned via next-word prediction with selective layer training (all layers frozen except the language-model head). This allows efficient adaptation to dialogue generation while limiting catastrophic forgetting.
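The freezing scheme described above can be sketched in PyTorch. The model below is a toy stand-in (the real project uses GPT-2); the module names `embed`, `blocks`, and `lm_head` are illustrative assumptions, not the project's actual code.

```python
import torch.nn as nn

# Toy stand-in for a decoder-only LM: embeddings, transformer-like body, LM head.
# Assumption: module names are illustrative, not taken from the project repo.
class TinyLM(nn.Module):
    def __init__(self, vocab=100, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.blocks = nn.Sequential(nn.Linear(dim, dim), nn.Linear(dim, dim))
        self.lm_head = nn.Linear(dim, vocab)

model = TinyLM()

# Freeze every parameter, then unfreeze only the LM head for fine-tuning.
for p in model.parameters():
    p.requires_grad = False
for p in model.lm_head.parameters():
    p.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
```

Only `lm_head.weight` and `lm_head.bias` remain trainable, so the optimizer updates a small fraction of the parameters while the pretrained body is preserved. Note that in Hugging Face's GPT-2 the LM head weight is tied to the input embeddings, so a real implementation must account for that tie.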

Data Pipeline

Preprocessing format that encodes categories and personalities followed by multi-line dialogue turns. This structured approach enables the model to condition generation on character traits.
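A minimal sketch of such a serialization step is shown below. The exact tag layout (`[CATEGORIES]`, `[PERSONALITY]`, the speaker-prefixed turns) is an assumed approximation of the project's preprocessing format, not its confirmed token scheme.

```python
def format_example(categories, personality, turns, eos="<|endoftext|>"):
    """Serialize one training example: a header encoding categories and
    personality, followed by multi-line dialogue turns and an EOS token.
    The tag names here are illustrative assumptions."""
    header = f"[CATEGORIES] {', '.join(categories)}\n[PERSONALITY] {personality}\n"
    dialogue = "\n".join(f"{speaker}: {line}" for speaker, line in turns)
    return header + dialogue + eos

sample = format_example(
    ["fantasy", "tavern"],
    "gruff but kind-hearted",
    [("Player", "Any rumours lately?"), ("NPC", "Aye, trouble in the old mine.")],
)
```

Prepending the trait header to every example is what lets the fine-tuned model condition its generations on character traits at inference time: the Game Master supplies the same header format in the prompt.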

Dataset Choice

Used a personality-labelled conversation dataset as a substitute for a dedicated D&D dataset. While not domain-specific, this approach demonstrates transfer learning principles.

Results Summary

Strengths

  • Generated text resembles roleplaying dialogue and is often relevant to the player's prompt

Limitations Observed

  • Sometimes continues generating category/personality tokens even after the end-of-text token
  • Tends toward romantic dialogue patterns due to dataset bias
  • Personality conditioning is not always respected consistently

Planned Improvements

  • Train longer / to convergence; try larger GPT-2 variants
  • Consider adjusting the language-model head and/or briefly unfreezing more layers at low learning rate
  • Collect domain-specific D&D dialogue data for better fine-tuning
  • Implement better post-processing to handle token cleanup
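The post-processing improvement above could address the first limitation (control tokens leaking past end-of-text). A minimal sketch, assuming the tag names `[CATEGORIES]` and `[PERSONALITY]` from a hypothetical preprocessing format:

```python
import re

EOS = "<|endoftext|>"

def clean_generation(text):
    """Truncate model output at the first end-of-text token and strip any
    leaked control tags. Tag names are assumptions about the training
    format, not confirmed project tokens."""
    text = text.split(EOS, 1)[0]  # drop everything after the EOS token
    # Remove category/personality headers that leaked into the dialogue.
    text = re.sub(r"\[(CATEGORIES|PERSONALITY)\][^\n]*", "", text)
    return text.strip()
```

For example, `clean_generation("Hello there.<|endoftext|>[PERSONALITY] brave")` keeps only the dialogue before the EOS token.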

Credits

Poster authors: Linus Lundgren, Maria Madalena Barros — Chalmers University of Technology.

Project Information

  • Category: Deep Machine Learning
  • Focus Areas: NLP, Fine-tuning, Transformers
  • Model: GPT-2
  • University: Chalmers University of Technology