Update 09/22/2021: Added a link to a gist to generate C# Entities. See below.

Update 06/18/2021: EDMX support will be integrated in a custom tool.

Update 02/26/2021: EDMX is back in the game 😉

Update 06/30/2020:  New lesson learned: When your database longs overflow Number.MAX_SAFE_INTEGER.

Update 10/04/2019: Added a "Lessons learned during development" section. See below.


Update 09/26/2019: Added a link to a gist to generate Typescript Entities. See below.


I’m currently helping one of my customers with an n-tier Entity Framework powered ASP.NET Core web service and an Angular 7.x application.

When it comes to real-life web apps, the tooling available around Entity Framework on the client side was somewhat limited. By a real-life application, I mean sending an object graph from the server to the client for manipulation, then sending the modified version back to the Web API layer.

The service layer should then apply the incoming changes to the database. The core idea is not to read the existing records into a new context, apply the changes and save them; rather, it is to apply the changes directly to a new context and save to the database with optimistic concurrency.

The Old New Thing

Microsoft introduced the concept of Self-Tracking Entities, then ended its support shortly afterwards in favor of a community-based alternative, Trackable Entities. The core concept remains the same: you send a disconnected graph to the client, changes are tracked there, the graph is sent back to the service layer, and the appropriate changes are applied to a new EF context and then saved to the database.
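
As a rough sketch of that idea, here is what the service layer can look like, assuming the ApplyChanges extension method from TrackableEntities.EF.Core; the Order entity, the OrdersContext and the _options field are placeholders for your own types:

    // Sketch only: apply the client's tracked changes to a fresh context, without
    // reading the existing records first. Order and OrdersContext are hypothetical;
    // _options is a DbContextOptions instance provided elsewhere.
    public async Task SaveOrderAsync(Order modifiedOrder)
    {
        using (var context = new OrdersContext(_options))
        {
            // Walks the graph and translates each TrackingState into an EF entity state.
            context.ApplyChanges(modifiedOrder);

            // Optimistic concurrency: throws DbUpdateConcurrencyException on conflict.
            await context.SaveChangesAsync();
        }
    }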

But wait: Trackable Entities was in fact quickly deprecated too, initially in favor of OData (for interoperability reasons), which, at the time I looked at it in 2012, didn’t have the notion of an object graph. I honestly don’t know whether that is still the case today.

With disconnected entities, the idea is to have a single round trip to the server for obvious performance reasons: ping + security handling + (de)serialization.

Interestingly, there has been a debate over that kind of solution. I have to say that, like Tony Sneed, I consider change tracking on the client a perfectly valid scenario if you own both the server and the client side. It is always a matter of context. I used it with a lot of success despite the hurdle of potentially introducing duplicate objects in the graph on the client.

Welcoming JavaScript to the party

Now, thanks to Trackable Entities, we have a way to generate C# entities for the service layer, and C# and TypeScript entities for the client. So we are more interoperable.

Careful though

When using JSON as the serialization format, you have to make sure that reference loop handling is carefully configured and that object references are preserved. You should also configure the serializer to camel-case the property names, in order to match the client entities. This is the default for ASP.NET Core Web API but not for SignalR. That covers the server side.

Here are the settings I’m using right now for sending data to the client:


    new JsonSerializerSettings
    {
        NullValueHandling = NullValueHandling.Ignore,
        ContractResolver = new CamelCasePropertyNamesContractResolver(),
        ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
        PreserveReferencesHandling = PreserveReferencesHandling.Objects,
    }
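
For completeness, here is a minimal sketch of how such settings might be registered globally; this assumes ASP.NET Core 3.x with the Microsoft.AspNetCore.Mvc.NewtonsoftJson package (on ASP.NET Core 2.x you would use AddJsonOptions instead):

    // In Startup.ConfigureServices -- sketch only, adjust to your actual startup code.
    services.AddControllers()
        .AddNewtonsoftJson(options =>
        {
            options.SerializerSettings.NullValueHandling = NullValueHandling.Ignore;
            options.SerializerSettings.ContractResolver = new CamelCasePropertyNamesContractResolver();
            options.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Serialize;
            options.SerializerSettings.PreserveReferencesHandling = PreserveReferencesHandling.Objects;
        });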

On the client, you have to handle deserialization so that these references are restored. You should also use a deserializer that calls the constructors on rehydrated entities, so that change tracking works properly. But in order to do that, you have to tell the serializer what to do with your JSON objects.

ES2015 target

Trackable Entities must be used while targeting ES6 (ES2015). This leads to interesting challenges with circular dependencies between modules. On the other hand, dcerialize works with TypeScript decorators.

TypeScript Decorators

TypeScript decorators are heavily used in both Angular and dcerialize, which is the JSON.NET-compatible serializer for JavaScript.

Originally we were using Angular 7.x, which relies on decorator metadata. However, decorator metadata is problematic when used with ES2015.

Fortunately, neither Angular 8 nor dcerialize relies on decorator metadata (I was initially wrong about the latter), so we upgraded to Angular 8 to get rid of this problem.

Date serialization

dcerialize accepts ISO 8601 date strings on deserialization but produces ticks (a long) when serializing Date objects.

The problem was quickly solved though.
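
For illustration, a converter along these lines can bridge the two formats; this is only a sketch, assuming the numeric value is milliseconds since the Unix epoch (which is what a JavaScript Date timestamp is), so adapt it to whatever your client-side serializer actually emits:

    // Sketch: accept ISO 8601 strings or epoch-millisecond numbers on read,
    // always emit ISO 8601 on write. Verify the epoch-milliseconds assumption
    // against what the client really sends.
    public class FlexibleDateTimeConverter : JsonConverter
    {
        public override bool CanConvert(Type objectType)
        {
            return objectType == typeof(DateTime) || objectType == typeof(DateTime?);
        }

        public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        {
            switch (reader.TokenType)
            {
                case JsonToken.Integer:
                    return DateTimeOffset.FromUnixTimeMilliseconds((long)reader.Value).UtcDateTime;
                case JsonToken.Date:
                    // Newtonsoft already parsed the ISO string for us.
                    return (DateTime)reader.Value;
                case JsonToken.String:
                    return DateTime.Parse((string)reader.Value, CultureInfo.InvariantCulture, DateTimeStyles.RoundtripKind);
                default:
                    throw new InvalidOperationException("Unexpected token for a DateTime value.");
            }
        }

        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            writer.WriteValue(((DateTime)value).ToString("o", CultureInfo.InvariantCulture));
        }
    }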

ASP.NET Core custom model binder

By default, ASP.NET Core uses its built-in JSON input formatter, with its own default settings, for deserialization.

Because we’re sending data to the client using JSON.NET with custom JsonSerializerSettings, we need to use JSON.NET with the same settings on deserialization. I had to write a custom model binder, shown below.


    /// <summary>
    /// Custom model binder to be used when Trackable Entities come in via HttpPost methods.
    /// </summary>
    public class TrackableEntityModelBinder : IModelBinder
    {
        /// <inheritdoc />
        public async Task BindModelAsync(ModelBindingContext bindingContext)
        {
            if (bindingContext == null)
            {
                throw new ArgumentNullException(nameof(bindingContext));
            }

            using (var reader = new StreamReader(bindingContext.HttpContext.Request.Body))
            {
                var body = await reader.ReadToEndAsync().ConfigureAwait(continueOnCapturedContext: false);

                // Deserialize with the same settings used when serializing to the client.
                var value = JsonConvert.DeserializeObject(body, bindingContext.ModelType, new JsonSerializerSettings
                {
                    NullValueHandling = NullValueHandling.Ignore,
                    ContractResolver = new CamelCasePropertyNamesContractResolver(),
                    ReferenceLoopHandling = ReferenceLoopHandling.Serialize,
                    PreserveReferencesHandling = PreserveReferencesHandling.Objects,
                });

                bindingContext.Result = ModelBindingResult.Success(value);
            }
        }
    }
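
To make MVC actually use this binder, point the affected action parameters at it with the ModelBinder attribute. A minimal sketch, where Order is a placeholder for one of your trackable entities:

    [HttpPost]
    public IActionResult UpdateOrder(
        [ModelBinder(BinderType = typeof(TrackableEntityModelBinder))] Order order)
    {
        // "order" was deserialized with the JSON.NET settings above, so its object
        // references and tracking information survive the round trip.
        return Ok(order);
    }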

Trackable Entities change tracker properties

There is no magic: along with the modified object graph, we have to send the Trackable Entities related properties.

But because we’re using TypeScript decorators, we can’t set them on code we do not own. Fortunately, nothing prevents us from adding them manually at the end of serialization.

How to generate C#/TypeScript entity code

You can generate dcerialize-aware trackable entities using the files in these gists (gist1, gist2).

Fork the TrackableEntities EntityFrameworkCore.Scaffolding.Handlebars repo and use my templates instead of those provided in the TypeScript sample.
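
The scaffolding picks the custom templates up through EF Core’s design-time services. A minimal sketch of that hook, assuming the AddHandlebarsScaffolding extension from the EntityFrameworkCore.Scaffolding.Handlebars package (see the repo’s TypeScript sample for the options that switch the generated entities to TypeScript):

    // Sketch: EF Core discovers this class at scaffolding time and lets the
    // Handlebars templates (placed under CodeTemplates/) drive code generation.
    public class ScaffoldingDesignTimeServices : IDesignTimeServices
    {
        public void ConfigureDesignTimeServices(IServiceCollection services)
        {
            services.AddHandlebarsScaffolding();
        }
    }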

Lessons learned during development

dcerialize internals

For dcerialize to work properly, you have to make sure that the JSON generated by the server can be processed successfully. That means the order of properties is very important: it must match between the server and the client. In order to do so:

  • Server side:
    • Decorate your properties with [JsonProperty(Order = X)] attributes.
    • Exclude the ITrackable properties from serialization. I chose approach 1 from this SO answer. Leaving them in may mess with the property order.
  • Client side, you have to use decorators anyway, but their order must match what you have server side.

Your best option is to generate the code in both cases to make sure everything is OK. I still have to make a template for the server side with the recommendations above; a rough sketch of the result follows.
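
Something along these lines, where Customer and its properties are placeholders. For illustration, the sketch uses JSON.NET’s ShouldSerialize* pattern to keep the ITrackable members out of the generated JSON while still letting them be populated on deserialization:

    // Sketch: a hypothetical trackable entity with explicit JSON property ordering.
    public class Customer : ITrackable
    {
        [JsonProperty(Order = 1)]
        public int CustomerId { get; set; }

        [JsonProperty(Order = 2)]
        public string Name { get; set; }

        [JsonProperty(Order = 3)]
        public List<Order> Orders { get; set; }

        // ITrackable members: populated from the incoming JSON but never serialized,
        // so they cannot disturb the property order expected by the client.
        public TrackingState TrackingState { get; set; }

        public ICollection<string> ModifiedProperties { get; set; }

        public bool ShouldSerializeTrackingState() => false;

        public bool ShouldSerializeModifiedProperties() => false;
    }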

Your database long values may be larger than JavaScript Number.MAX_SAFE_INTEGER

This one was particularly hard to figure out. A business decision made some long integers (SQL BIGINT) go up to 17 digits instead of 15. We were sending them to the client, only to get them back later in some REST API calls.

To our big surprise, we were receiving wrong values back. How on earth could that be possible? Well, after eliminating all potential problems in the server business layers, we turned our attention to client-side deserialization. JavaScript numbers are IEEE 754 doubles, so any integer above Number.MAX_SAFE_INTEGER (2^53 - 1, about 9 × 10^15) silently loses precision.

One of the solutions would have been to use BigInt. But unfortunately, it seems that it is not compatible with dcerialize.

So we decided to use strings on the client, leaving longs on the server entities. I added a JSON converter on the server side to be used on the affected long properties.


    /// <summary>
    /// Serializes long (SQL BIGINT) values as strings so they survive the trip
    /// through JavaScript, and parses them back to longs on the way in.
    /// </summary>
    public class BigIntToStringConverter : JsonConverter
    {
        /// <inheritdoc />
        public override bool CanConvert(Type objectType)
        {
            return objectType == typeof(long);
        }

        /// <inheritdoc />
        public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        {
#pragma warning disable CA1062 // Validate arguments of public methods
            JValue jValue = new JValue(reader.Value);
#pragma warning restore CA1062 // Validate arguments of public methods
            if (jValue.Type == JTokenType.String)
            {
                string incoming = (string)jValue.Value;

                long bigInt = long.Parse(incoming, CultureInfo.InvariantCulture);
                return bigInt;
            }

            throw new InvalidOperationException("Value is not a JavaScript big integer.");
        }

        /// <inheritdoc />
        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            long bigInt = (long)value;
#pragma warning disable CA1062 // Validate arguments of public methods
            writer.WriteValue(bigInt.ToString(CultureInfo.InvariantCulture));
#pragma warning restore CA1062 // Validate arguments of public methods
        }
    }
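
The converter is then applied to the affected properties only; a sketch with a hypothetical entity property:

    // Hypothetical property: exposed to the client as a string, kept as a long
    // (SQL BIGINT) on the server entity.
    [JsonConverter(typeof(BigIntToStringConverter))]
    public long ExternalReferenceNumber { get; set; }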

Final thoughts

The following is slightly off topic: these are my ramblings about Microsoft’s decisions regarding EF Core.

EDMX is not supported in EF Core

Self-Tracking Entities were generated from an EDMX file with the help of T4 templates. I loved using Entity Data Models because you could choose what to include in your model, remove unwanted properties and/or relations that didn’t suit your needs, and you were done. Furthermore, you could, as I did, add content to the XML file just for your own needs during T4 processing.

The Scaffold-DbContext reverse engineering tool lets you generate entities and a context from an existing database, but you’re left on your own to remove what you don’t need from the entities and, furthermore, from the Code First fluent API calls that configure the context.

Good luck with that. There are so many ways to misinterpret how it works internally if you don’t know the conventions that EF Core Code First uses. In my opinion, Microsoft should have provided a way to reverse engineer from an EDMX as well, not just from an existing database that may contain lots of schemas and unwanted tables/views, etc.

Given the millions of developers who worked with RDBMSs prior to the birth of Entity Framework, Microsoft could have spent some time writing a custom ADO.NET provider targeting EDMX files. Windows developers would have applauded Microsoft for that. This is a strategic mistake, in my opinion.

If I had the time I would surely do something about it, but I don’t. And honestly, in my opinion, the way to go for new projects would now be GraphQL, not Entity Framework. I’ve tried PostGraphile in particular and I just love it.

Happy coding with Entity Framework and Trackable Entities!
