First Tool Release: Content Manager

After 7 weeks of brushing up on my React, SQL, and auth skills, I have finally released our first tool -- a content manager! My motivation for creating this tool was to give those working on the project from a design perspective, but who cannot write code, more ways to contribute to our game. Now I can continue coding and improving our clients while others submit content that will be included in all future releases.

Currently, access to the tool is restricted. If you would like to contribute, please post your thoughts in the new Content Ideas forum and, if we like them, we will get them added to the game. If you're a frequent poster with ideas that we accept, we may give you submitter access directly.

What Does the Tool Cover?

This change ended up taking a lot more work than I had anticipated. At first I planned to release a tool only for modifying items and spells, but it quickly grew to cover most entities in the game. One of the challenges I faced was mixing my local test data -- the list of items for a merchant, for example -- with the item definitions I was now creating on my remote server. There was probably a way to merge the two, and I would have had the manager for items/spells ready 3 weeks ago, but I decided to take the extra time and cover as much content as I could.

  • Assets (Item Icons, Spell Icons, Equipment Models, Spell Particles)
  • Items
  • Factions
  • NPC Factions / Faction Awards
  • Spells
  • Spell Lines
  • Zones
  • Loot Drops / Loot Tables
  • Merchants
  • NPC Templates
  • Spawns / Spawn Groups / Waypoints
  • Race/Class/Deity Starting Locations & Items

To give an idea of how big some of these are.. the item table has a little over 100 fields. To be fair, only that table breaks 100, and only 2 others are above 50. Many of the rest are small.

Using the Tool

So what does the tool look like? How does it work?

As I mentioned in my previous post, the service behind this tool simply wraps our database entities into an API. The API is pretty generic, and now exposes normal fetch operations such as pagination and filtering.
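As a rough sketch of what "generic fetch operations" means here (the route, type names, and parameters below are my illustration, not the actual service code), a single controller can serve paginated, filtered lists for any entity type:

```csharp
// Hypothetical sketch of a generic entity endpoint -- names are assumptions.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public interface IEntityDao {
    Task<IEnumerable<object>> FetchPage(string entityType, int page, int pageSize, string filter);
}

[Route("api/entities/{entityType}")]
public class EntityController : ControllerBase {
    private readonly IEntityDao _dao;

    public EntityController(IEntityDao dao) {
        _dao = dao;
    }

    // e.g. GET api/entities/asset?page=2&pageSize=50&filter=assetType:ItemIcon
    [HttpGet]
    public async Task<IActionResult> List(string entityType, int page = 1,
            int pageSize = 25, string filter = null) {
        var results = await _dao.FetchPage(entityType, page, pageSize, filter);
        return Ok(results);
    }
}
```

The same page/size/filter query parameters then work for every entity the tool covers.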

List View of Item Icons (Asset Entity w/ an Item Icon assetType filter)

Generically speaking, when clicking on a row in the entity list, you are brought to the view page for that entity.

View Item Detail Page

You can then click on the edit button and add changes, such as restricting the race/class/deity on an item:

Searching for other entities, such as the Clicky spell to be used on an item:

Or you can create an entirely new entity:

When saving the change, a content submitter can provide the cause/reason for the change.

When the above record is modified by a non-admin user, the change moves into an unapproved state until an admin approves it. As an admin, an extra (green) button appears for reviewing changes.

That then brings up a page which shows you the differences between the previously approved version and the current state.
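Conceptually, the approval flow boils down to a status carried on each submission. This is my own sketch of that idea -- the type and member names are assumptions, not the actual content manager code:

```csharp
// Illustrative sketch of the approval flow -- names are assumptions.
using System;

public enum SubmissionStatus {
    Unapproved, // saved by a non-admin submitter, awaiting review
    Approved,   // reviewed and accepted by an admin
    Deleted     // soft-deleted; kept around for reference or restoration
}

public class Submission {
    public SubmissionStatus Status { get; private set; } = SubmissionStatus.Unapproved;
    public string Reason { get; set; } // cause/reason captured on save

    public void Approve(bool isAdmin) {
        if (!isAdmin)
            throw new InvalidOperationException("Only admins can approve changes.");
        Status = SubmissionStatus.Approved;
    }
}
```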

The only other thing that can really be done is deleting content:

However, we don't really delete entities just in case we want to look them up later for reference -- or even restore them.

Exporting Approved Entities

With the data living in a separate tool, I needed to add a way for different processes to access the entities which have been approved. To do this, I have an API which takes the requested entities, converts them into the corresponding client or server models, then returns those as compressed files. We now have a few scripts that can generate login tokens with our authentication service and call the previously mentioned APIs to test the current set of submissions during local development. This is also used during the release of new clients and/or game servers.

Downloading entities from the content manager
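As a rough sketch of what those scripts do (the URL and token flow here are illustrative assumptions, not our actual endpoints), a process can authenticate, download the compressed export, and later read it back with the EntityIO helper shown further down:

```csharp
// Illustrative download script -- the URL and token handling are assumptions.
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class ExportDownloader {
    public static async Task DownloadItemsAsync(string token) {
        using (var client = new HttpClient()) {
            // Token previously generated via the authentication service.
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", token);

            // Hypothetical export endpoint returning a zipped entity file.
            var bytes = await client.GetByteArrayAsync(
                "https://content.example.com/api/export/items");
            File.WriteAllBytes("items.zip", bytes);
        }
    }
}
```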

In my last post I showed our abstract class which contained queries for fetching submissions. One of the many changes I made since then is to allow for each Submission DAO to specify which client DTO it corresponds to as well as which server domain model it corresponds to. Then, reusable extensions can easily convert the entity between the client DTO and server domain models for exporting the data into a downloadable format.

using Funkhouse.Data.Factory.Interfaces;
using Funkhouse.Data.Session.Interfaces;
using Funkhouse.Logging.Interfaces;
using RPG.Common.Inventory.Models;
using RPG.Server.Core.Inventory.Extensions;
using RPG.Web.Service.Content.Data.Dao;
using RPG.Web.Service.Content.Inventory.Entities;
using RPG.Web.Service.Content.Submission.Content.Types;

namespace RPG.Web.Service.Content.Inventory.Dao {
    public class ItemSubmissionDao : AbstractSubmissionDao<ItemSubmissionEntity, RPGItem> {
        public override ContentSubmissionType ContentType => ContentSubmissionType.Item;
        public ItemSubmissionDao(IDatabaseSessionFactory dbSessionFactory, IDataTypeFactory dataTypeFactory, SubmissionDao submissionDao, IFunkyLogger logger)
            : base(dbSessionFactory, dataTypeFactory, submissionDao, logger) { }

        protected override RPGItem GetDtoFromEntity(ItemSubmissionEntity entity) {
            // Extension method converting the entity into the client DTO
            // (name assumed here, based on the conversions described above).
            return entity.ToDto();
        }

        protected override object GetModelFromEntity(ItemSubmissionEntity entity) {
            // Extension method converting the entity into the server domain model.
            return entity.FromEntity();
        }
    }
}
Some of the code can use a bit of polishing, but after so many weeks I really needed to get things checked in. So be forewarned.. a good amount of the code I shared here is subject to future refactors 😉 such as this converter to quickly export to/from a zip file:

using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Reflection;

namespace RPG.Common.Packaging {
    public static class EntityIO {
        public class ContractResolverWithPrivates : CamelCasePropertyNamesContractResolver {
            protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization) {
                var prop = base.CreateProperty(member, memberSerialization);

                if (!prop.Writable) {
                    var property = member as PropertyInfo;
                    if (property != null) {
                        // Treat properties with private setters as writable so they deserialize.
                        var hasPrivateSetter = property.GetSetMethod(true) != null;
                        prop.Writable = hasPrivateSetter;
                    }
                }

                return prop;
            }
        }

        private static JsonSerializerSettings GetSerializerSettings() {
            return new JsonSerializerSettings {
                TypeNameHandling = TypeNameHandling.Objects,
                TypeNameAssemblyFormatHandling = TypeNameAssemblyFormatHandling.Simple,
                ConstructorHandling = ConstructorHandling.AllowNonPublicDefaultConstructor,
                ContractResolver = new ContractResolverWithPrivates()
            };
        }

        private static MemoryStream WriteToStream<T>(string fileName, List<T> entities) {
            var json = JsonConvert.SerializeObject(entities, Formatting.None, GetSerializerSettings());

            // Not wrapped in a using -- the caller owns (and disposes) the returned stream.
            var zipStream = new MemoryStream();
            using (ZipArchive zip = new ZipArchive(zipStream, ZipArchiveMode.Create, true)) {
                ZipArchiveEntry entry = zip.CreateEntry(fileName);
                using (BinaryWriter writer = new BinaryWriter(entry.Open())) {
                    writer.Write(json);
                }
            }
            return zipStream;
        }

        public static byte[] WriteToBytes<T>(string fileName, List<T> entities) {
            using (var ms = WriteToStream(fileName, entities)) {
                return ms.ToArray();
            }
        }

        public static void WriteToFile<T>(string fileName, List<T> entities) {
            using (var ms = WriteToStream(fileName, entities)) {
                ms.Seek(0, SeekOrigin.Begin);

                using (FileStream fs = new FileStream(fileName, FileMode.OpenOrCreate)) {
                    ms.CopyTo(fs);
                }
            }
        }

        public static List<T> ReadFromFile<T>(string fileName) {
            var entities = new List<T>();
            using (var file = File.OpenRead(fileName))
            using (var zip = new ZipArchive(file, ZipArchiveMode.Read)) {
                foreach (var entry in zip.Entries) {
                    using (var stream = entry.Open())
                    using (var reader = new BinaryReader(stream)) {
                        // ReadString matches the length-prefixed string written
                        // by BinaryWriter.Write(string) above.
                        entities.AddRange(JsonConvert.DeserializeObject<List<T>>(reader.ReadString(), GetSerializerSettings()));
                    }
                }
            }
            return entities;
        }
    }
}

Finally, to really utilize this process, I added a new way to configure the player client to use those exported client DTOs, and a way to configure the servers to use the exported server domain models. With that, most of the remaining work involved testing/refactoring/fixing issues.


I've already listed a handful of the challenges above, but some of the other tough issues worth calling out:

  • Serializing C# objects using .NET Core and deserializing them on .NET Framework
  • Creating reusable React components that fit the case for every entity (Was well worth the effort though! Takes a few minutes to plug in any new entity now)
  • Adding both pagination & filters, including both inclusion and exclusion filters, without opening up SQL injection.
  • Separating a test vs a prod environment. Everything thus far has had one environment and needed configuration for a "production" mode. Secrets had to be moved to secure locations 😉
  • Dealing with bitmasks on the GUI. For context, the slots an item can be equipped to, the race an item can be equipped by, the class an item can be equipped by, etc are all configured using bit masks.
  • Making sure all the entities are named correctly and collect the proper input in the right format.
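For the pagination & filter point above, the general defense is parameterized queries. A minimal sketch (the table, DAO shape, and names here are illustrative, not our actual data layer):

```csharp
// Illustrative injection-safe filtering -- table and method names are assumptions.
using System.Data;
using System.Data.SqlClient;

public static class ItemQueries {
    public static IDataReader FetchPage(SqlConnection conn, string nameFilter, int page, int pageSize) {
        var cmd = conn.CreateCommand();
        // Filter values are bound as parameters, never concatenated into the SQL text.
        cmd.CommandText = @"SELECT * FROM Items
                            WHERE Name LIKE @name
                            ORDER BY Id
                            OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY";
        cmd.Parameters.AddWithValue("@name", "%" + nameFilter + "%");
        cmd.Parameters.AddWithValue("@offset", (page - 1) * pageSize);
        cmd.Parameters.AddWithValue("@pageSize", pageSize);
        return cmd.ExecuteReader();
    }
}
```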

I want to talk about the last two a little bit more in depth.

The last challenge there is very important because this tool is now productionized. We're taking backups of the database, and the idea is for this managed content to be the actual data released in the game's final version. However, the entities are now tightly coupled to the GUI. If we change a field from a float to an int.. and now expect a value 10x larger.. we need to manually convert all of the old submissions to the new value.

Right now we don't have a way to handle conversions like the above, but we can easily handle renames/additions by manually altering the table before deploying the new web application and web service. Once we need to start making drastic alterations, the plan is to add a new field.. "version".. to each entity, as well as a way to convert from one version to the next. Then, any time an old version is loaded by the service, it can quickly convert it to the newest version on the fly.
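That planned versioning scheme could look something like the following -- purely a sketch of the idea, with all names (and the example float-to-int conversion) being my own assumptions:

```csharp
// Illustrative sketch of per-entity versioning -- names are assumptions.
using System;
using System.Collections.Generic;

public class VersionedEntity {
    public int Version { get; set; }
    public Dictionary<string, object> Fields { get; set; } = new Dictionary<string, object>();
}

public static class EntityUpgrader {
    // One converter per version step: 1 -> 2, 2 -> 3, and so on.
    private static readonly Dictionary<int, Action<VersionedEntity>> Converters =
        new Dictionary<int, Action<VersionedEntity>> {
            // Hypothetical v1 -> v2 change: a float field becomes a 10x int.
            [1] = e => e.Fields["weight"] = (int)((float)e.Fields["weight"] * 10)
        };

    public static void UpgradeToLatest(VersionedEntity entity, int latestVersion) {
        // Apply each converter in order until the entity is current.
        while (entity.Version < latestVersion) {
            Converters[entity.Version](entity);
            entity.Version++;
        }
    }
}
```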

The second to last challenge, bitmasks, is worth calling out because of how powerful they have been in reducing our payload's overhead. Instead of a ton of boolean fields like "fitsInChestSlot" (1 byte each), we have a single 32-bit integer (4 bytes) representing 22 fields.

using System;

namespace RPG.Common.Inventory.Types.Slots {
    [Flags]
    public enum EquipmentSlotType : uint {
        None = 0,
        LeftEar = 1,
        Head = 2,
        Face = 4,
        RightEar = 8,
        Neck = 16,
        Shoulder = 32,
        Arms = 64,
        Back = 128,
        LeftBracer = 256,
        RightBracer = 512,
        Range = 1024,
        Hands = 2048,
        Primary = 4096,
        Secondary = 8192,
        LeftRing = 16384,
        RightRing = 32768,
        Chest = 65536,
        Legs = 131072,
        Feet = 262144,
        Waist = 524288,
        Ammo = 1048576
    }
}
We can easily convert the 32 bit integer to and from a list of EquipmentSlotTypes. The below simply converts a bitmask, 0000000010001, for example, into a list of the "enabled" bits.

public static List<EquipmentSlotType> MaskToList(EquipmentSlotType mask) {
  // Enumerate every defined slot and keep the ones whose bit is set in the mask.
  return Enum.GetValues(typeof(EquipmentSlotType))
    .Cast<EquipmentSlotType>()
    .Where(slot => slot != EquipmentSlotType.None && (slot & mask) == slot)
    .ToList();
}
What's Next

Well, we have a few things to address on the content manager flow:

  • Simplify factions by having editors choose "Dubious" vs "Scowls", for example, rather than needing to know the exact faction number. Include references to faction values in cases where a string representation doesn't make sense.
  • Mobile friendliness (mostly done via Bootstrap, but there seems to be something messing with the viewport..)
  • Error validation for non-numeric/string fields
  • Filter out already included entities in lists where it doesn't make sense to add the same item twice.

That should really finish up the experience when using the tool, which is already pretty good for the time put into it.

However, after 7 weeks of working on something that felt an awful lot like my day job... I'm going to spend a few weeks knocking out the last bit of things on our roadmap before I transition to mostly content (building zones, rigging equipment, adding NPCs, adding audio). I plan to spend at least half of the year adding equipment and really bringing life to the game. I'd like to have a really good "demo" test that could be performed by 10-15 people by the end of the year.

More updates to come. If you have any questions on the specifics from this post, feel free to start a new conversation in our forums!
