Using tables instead of table extensions

3 Mar

Recently James Crowter wrote an excellent article about table extensions and how they affect performance. In short, table extensions are great for flexibility and ease of development, but performance degrades as the number of table extensions adds up, especially when table extensions are used on hot tables. By hot tables I mean tables that are used often, like Item, Customer, Sales Line, Item Ledger Entry, etc.

So I started thinking: when would you decide not to use table extensions? And how do you work with supplemental tables while still giving the end-user the same experience? In this blog post, I want to make some suggestions. I’m aware this can be done in many ways, so this is just one approach.

Considerations

Let’s first go through the process of choosing between a table extension and a supplemental table. Basically, it should be possible to put all the extra fields you have for a particular table into a supplemental table rather than a table extension. Behind the scenes, the table extension is a supplemental table anyway. The only difference, and definitely an important one, is that the table extension appears to be one table to the developer. With some extra effort we can achieve that ourselves, but what do we gain for that extra effort? Well, I guess that’s already clear: we get better overall performance because there is no automatic join. We decide in code when to read the supplemental table.

And that’s exactly what should be the main consideration when choosing between a supplemental table and a table extension. Do you need those extra fields in many places in the code, or just in a few? In other words, would you benefit from the automatic join of the companion table, or would it be a waste of resources because you only use those fields in a few places? Besides that, you also need to look at how much the table is being used. Is it a hot table, used in many different places and a favorite table to extend? Then you should be careful with table extensions.

The Data Table

Ok, let’s imagine I have some fields to be added to the Item. Note that I’m not saying ‘to the Item table‘. I want the user to see those fields on the Item list and card as if they were part of the Item table, but I’ve chosen to put them into a supplemental table. The first step is to create that table.

table 50100 "Item Extra Fields AJK"
{
    fields
    {
        field(1; Id; Guid)
        {
        }
        field(2; "Item No."; Code[20])
        {
            FieldClass = FlowField;
            CalcFormula = lookup(Item."No." where(SystemId = field(Id)));
        }
        field(3; "Perishable"; Boolean)
        {
            trigger OnValidate()
            begin
                if Perishable then
                    "Storage Temperature" := 4
                else
                    "Storage Temperature" := 0;
            end;
        }
        field(4; "Food Category"; Enum FoodCategory)
        {
        }
        field(5; "Storage Temperature"; Decimal)
        {
        }
    }

    keys
    {
        key(PK; Id) { }
    }
}

As you can see, the table is not using the Item No. field as primary key. Instead, it uses an Id field that I want to be the same value as the SystemId of the Item record.

The Item No. field is a FlowField, so if I run the table I can see to what Item that record belongs.

The other fields are the fields that I want to add to the Item Card and Item List page. The field Perishable has code in the OnValidate trigger, and I want that to execute as normal when the user modifies that field.

The Tableextension

How are we going to work with this table? I figured it would make sense to create a table extension for the Item table with functions to read and save the supplemental table. That way, we have one central place to get to the data and to save it. The table extension can also hold FlowFields pointing to the fields of the supplemental table, so you can get the values directly without reading the supplemental table yourself. That’s especially handy when reading values: with SetAutoCalcFields you can force the values from the supplemental table to be joined in when you loop through the base table, instead of reading the supplemental table for every single record.

tableextension 50100 "Item AJK" extends Item
{
    fields
    {
        field(50100; "Perishable AJK"; Boolean)
        {
            Caption = 'Perishable';
            FieldClass = FlowField;
            CalcFormula = lookup ("Item Extra Fields AJK".Perishable where(Id = field(SystemId)));
        }
        field(50101; "Food Category AJK"; Enum FoodCategory)
        {
            Caption = 'Food Category';
            FieldClass = FlowField;
            CalcFormula = lookup ("Item Extra Fields AJK"."Food Category" where(Id = field(SystemId)));
        }
        field(50102; "Storage Temperature AJK"; Decimal)
        {
            Caption = 'Storage Temperature';
            FieldClass = FlowField;
            CalcFormula = lookup ("Item Extra Fields AJK"."Storage Temperature" where(Id = field(SystemId)));
        }
    }

    var
        _ItemExtraFields: Record "Item Extra Fields AJK";

    procedure GetItemExtraFields(var ItemExtraFields: Record "Item Extra Fields AJK")
    begin
        ReadItemExtraFields();
        ItemExtraFields := _ItemExtraFields;
    end;

    procedure SetItemExtraFields(var ItemExtraFields: Record "Item Extra Fields AJK")
    begin
        _ItemExtraFields := ItemExtraFields;
    end;

    procedure SaveItemExtraFields()
    begin
        if not IsNullGuid(_ItemExtraFields.Id) then
            if not _ItemExtraFields.Modify() then
                _ItemExtraFields.Insert(false, true);
    end;

    procedure DeleteItemExtraFields()
    begin
        ReadItemExtraFields();
        if _ItemExtraFields.Delete() then;
    end;

    local procedure ReadItemExtraFields()
    begin
        if _ItemExtraFields.Id <> SystemId then
            if not _ItemExtraFields.Get(SystemId) then begin
                _ItemExtraFields.Init();
                _ItemExtraFields.Id := SystemId;
                _ItemExtraFields.SystemId := SystemId;
            end;
    end;
}

As you can see, the FlowFields are based on the value of the SystemId field, linked to the Id field of the supplemental table. In the function ReadItemExtraFields these values are set in case the record does not exist yet.

The current record of the corresponding supplemental table is stored in a global variable in the table extension. With a get and set function, it is possible to read or set the actual values.

The function SaveItemExtraFields actually saves the values back to the supplemental table in the database. In this approach, not all Item records automatically have a corresponding record in the supplemental table, so I use the construct if not Modify then Insert. The second parameter of Insert tells the platform not to create a new SystemId but to use the value that we already set in the function ReadItemExtraFields. Of course, this only works correctly if you follow the flow of these functions:

  • GetItemExtraFields
  • SetItemExtraFields
  • SaveItemExtraFields

If you only used SetItemExtraFields and did not properly initialize the record with the Id values, then this is going to fail.
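Put together, reading and updating a value from plain code could look like this. This is a minimal sketch; the codeunit name, object ID and the procedure are just illustrations, not part of the solution above:

```al
codeunit 50101 "Item Extra Fields Demo AJK"
{
    // Illustrative only: shows the intended Get -> modify -> Set -> Save flow.
    procedure MarkAsPerishable(ItemNo: Code[20])
    var
        Item: Record Item;
        ItemExtraFields: Record "Item Extra Fields AJK";
    begin
        Item.Get(ItemNo);
        // 1. Read (or initialize) the supplemental record for this item
        Item.GetItemExtraFields(ItemExtraFields);
        // 2. Change values, running the OnValidate trigger as usual
        ItemExtraFields.Validate(Perishable, true);
        // 3. Hand the changed record back to the table extension
        Item.SetItemExtraFields(ItemExtraFields);
        // 4. Persist it (insert or modify, keeping the Id we set)
        Item.SaveItemExtraFields();
    end;
}
```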

Usage on an editable Page

Let’s have a look at how this works on the Item Card page.

pageextension 50101 "Item Card AJK" extends "Item Card"
{
    layout
    {
        addafter(Item)
        {
            group(FoodDetails)
            {
                Caption = 'Food Details';
                field(PerishableAJK; ItemExtraFields.Perishable)
                {
                    Caption = 'Perishable';
                    ApplicationArea = All;

                    trigger OnValidate()
                    begin
                        ItemExtraFields.Validate(Perishable);
                        SetItemExtraFields(ItemExtraFields);
                    end;
                }
                field(FoodCategoryAJK; ItemExtraFields."Food Category")
                {
                    Caption = 'Food Category';
                    ApplicationArea = All;

                    trigger OnValidate()
                    begin
                        SetItemExtraFields(ItemExtraFields);
                    end;

                }
                field(StorageTemperatureAJK; ItemExtraFields."Storage Temperature")
                {
                    Caption = 'Storage Temperature';
                    ApplicationArea = All;

                    trigger OnValidate()
                    begin
                        SetItemExtraFields(ItemExtraFields);
                    end;
                }
            }
        }
    }

    var
        ItemExtraFields: Record "Item Extra Fields AJK";

    trigger OnInsertRecord(BelowxRec: Boolean): Boolean
    begin
        SaveItemExtraFields();
    end;

    trigger OnModifyRecord(): Boolean
    begin
        SaveItemExtraFields();
    end;

    trigger OnClosePage()
    begin
        SaveItemExtraFields();
    end;

    trigger OnAfterGetCurrRecord()
    begin
        GetItemExtraFields(ItemExtraFields);
    end;
}

The supplemental table record is stored in a global variable on the page and read in the OnAfterGetCurrRecord trigger. The fields are then displayed on the screen with their source expressions pointing to the respective fields of that record. That makes sure the values are stored directly in the record variable. But now we are getting to some details that I think developers are not going to like much. Because it is different from how we are used to working with table fields, we need to do some extra work.

First of all, you need to specify the caption again. That means we now have the same caption three times: in the supplemental table, on the FlowField, and now also on the page. They will end up as three different entries in the .xlf file and they all need to be translated. So we must make sure the captions and translations stay aligned across all the places we use these fields.

The other point is that the OnValidate trigger does not execute automatically. The value will be stored in the field of the global record variable, but it is treated as a global variable, not as a table field with an OnValidate trigger. Hence, we have to explicitly validate the field from code on the page.

I have chosen to save the current record values back to the supplemental table from several page triggers. It would also be possible to do so only from the OnModify trigger, but better safe than sorry.

Usage on a read-only Page

Displaying the fields on the Item List page is just a matter of adding the FlowFields. They support sorting and filtering, so the user would not even notice that these fields come from a supplemental table.

pageextension 50100 "Item List AJK" extends "Item List"
{
    layout
    {
        addafter(Description)
        {
            field("Perishable AJK"; "Perishable AJK") { ApplicationArea = All; }
            field("Food Category AJK"; "Food Category AJK") { ApplicationArea = All; }
        }
    }
}
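The same FlowFields also make it cheap to read the values in code when looping over many items. A sketch, assuming the objects above (the codeunit name and ID are hypothetical):

```al
codeunit 50102 "Perishable Count Demo AJK"
{
    // Illustrative only: SetAutoCalcFields lets the platform join the
    // FlowField values into the same query, instead of issuing one
    // extra read of the supplemental table per record.
    procedure CountPerishableItems(): Integer
    var
        Item: Record Item;
        Total: Integer;
    begin
        Item.SetAutoCalcFields("Perishable AJK");
        if Item.FindSet() then
            repeat
                if Item."Perishable AJK" then
                    Total += 1;
            until Item.Next() = 0;
        exit(Total);
    end;
}
```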

Deleting the record

The final part is a delete trigger. For that, I propose to use an event subscriber rather than the page triggers. In its simplest form, that would look like this:

codeunit 50100 "Item Subscribers AJK"
{
    SingleInstance = true;

    [EventSubscriber(ObjectType::Table, Database::Item, 'OnAfterDeleteEvent', '', false, false)]
    local procedure OnDelete(var Rec: Record Item)
    begin
        Rec.DeleteItemExtraFields();
    end;
}

Final thoughts

There are other approaches possible as well, such as using the OnInsert event to always create a corresponding record in the supplemental table and assuming that record is always available. That would also require an install and upgrade procedure to sync the tables during installation or upgrade of the app.
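A hedged sketch of that alternative: an extra subscriber that could be added to the codeunit above. Note the assumption that, with this in place, SaveItemExtraFields could rely on Modify alone instead of the if-not-Modify-then-Insert construct.

```al
// Sketch of the alternative: always create the supplemental record on insert.
[EventSubscriber(ObjectType::Table, Database::Item, 'OnAfterInsertEvent', '', false, false)]
local procedure OnInsertItem(var Rec: Record Item)
var
    ItemExtraFields: Record "Item Extra Fields AJK";
begin
    if Rec.IsTemporary() then
        exit;
    ItemExtraFields.Id := Rec.SystemId;
    ItemExtraFields.SystemId := Rec.SystemId;
    // Second parameter keeps the SystemId we assigned above
    if not ItemExtraFields.Insert(false, true) then;
end;
```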

This is by far not as convenient as using table extensions, I have to admit. And I’m not very proud of this solution. But it works, and with this little extra effort we can avoid the heavy load of multiple table extensions. As James said, we should use great power responsibly. So, if writing some extra code is the price we have to pay for a system that performs better, so be it.

I hope Microsoft will work on other solutions in the meantime, which would not be easy. Think about enabling joins from table extensions on the fly, as we do with SetAutoCalcFields for FlowFields. It would still be hard to cover scenarios where records are passed through an event: they would probably not contain the joined data, so we would still need to get it, maybe with something similar to CalcFields. What do I know…

All code that is demonstrated here can also be found on GitHub.

15 thoughts on “Using tables instead of table extensions”

  1. There are some problems with your solution:
    - Primary key is not the item no.: if you read a table by the item no. and want to get an extra field, you first need to read the Item table to get the Id.
    - How you do this is correct: you NEED to save every change directly in the OnValidate. The OnModify won’t be triggered if you don’t change a field in the Item table itself. At least, this was the case some NAV versions ago, but I think it still is, because of performance.
    - Instead of using the table delete event, it is better to add a modify trigger in the Item table extension. At least then the code stays together with the table; otherwise you have another object for it. This is not for performance, but for debug/fix/maintain reasons. Extensions make this a lot more difficult: there may be multiple places where some code is run.

    • Thanks for the feedback!

      I’ve chosen to use the SystemId instead of the No. field because it’s an immutable key. With that, I wanted to avoid the hassle of renames. In case you need to read the value of an extra field, just read the Item record; the FlowField will give you the value. With SetAutoCalcFields you can indicate that you want the FlowField value to be included, so you don’t have to do an extra CalcFields, which would result in an extra SQL query. Table extensions don’t need the extra SetAutoCalcFields, but that’s the whole problem, right? So, in case Microsoft comes up with a solution, I would expect it to be something similar: indicating in code that you want to include table extension fields.

      You are right, if you don’t change any field on the record itself, the OnModify on the page will not be triggered. That’s why I’ve added an extra call to SaveItemExtraFields in the OnClosePage trigger. It’s a little overhead, and it could even be further optimized by setting a boolean to track whether a field has been modified. But I didn’t want to overcomplicate it.

      I don’t understand how a modify trigger in the Item table extension solves the deletion of a record. Maybe you mean an OnDelete trigger in the table? Unfortunately, that’s not sufficient. The OnDelete trigger in the table extension only executes with Delete(true). I wanted to cover Delete(false) too, and then you have to use event subscribers in codeunits.

      • SystemID: Yes, but reading the flowfield then reads the extra table AFTER it read the original table. So no performance gain.

        Modify-trigger – delete-trigger: I mixed up modify- and delete trigger here… But about the event triggers, I forgot a moment the possibility that a dev might do a delete on the record. In that case yes, event trigger is better. Performance stays the same, code is less easy to find because it is in another file, but it does catch lazy/bad programming.

        • When using SetAutoCalcFields, the SQL query should be optimized to include the values from the related table using joins. That’s what I’ve always been told. I would say that’s no different from joining fields from companion tables: no performance gain, but no performance loss either. Well… for read-only pages. For editable pages you need the extra read and write, which could probably be optimized by using the FlowFields.

  2. … and you need additional licence keys for the table objects. Table extensions are free on-prem.

    • For the customization range you are right. And on-prem might be less prone to the problem anyway, because those customers may not stack many different apps. I was merely thinking about the SaaS platform, obviously.

  3. Did you consider the issue with using a GUID as primary key, where the primary key by default is clustered? Every insert might result in a resort of the table, as the clustered index ‘is’ the table and the SystemId is random in nature. One might argue that the insert in Item will also cause a sort, but usually customers tend to use number series, and item numbers do not tend to be as random as the SystemId/GUID. My experience with customers is that even with a large number of customizations, add-ons and what-not, the limit of 256 JOINs is not even close. With properly indexed tables in NAV/BC, the speed is usually good. It is more important to keep PKs at the beginning of the table, not random in nature, with strictly increasing datatypes when possible, and so on; more general MSSQL optimizations.

    • Yes, I’ve given that a thought. But I don’t think the SystemId is unsorted just because it’s a GUID. In fact, it is a sequence formatted as a GUID: every next SystemId value in the same table will be the next number instead of a random value.

  4. I’m quite curious. If you have 10 extensions as TableExt across 100k records vs 10 of this solution, what are the performance implications?

    Also – does this technically help with table locking if you have a background job that updates values in the “extension-ish” tables?

    • I haven’t done any performance testing. The goal was to come up with an alternative to table extensions. I think we should always consider alternatives and not choose a standard solution for every situation.

      I don’t think table locking is going to be any different; basically, there is no difference between table extensions and supplemental tables in that respect.

    • We had a BC14 database (approx. 40GB of data) converted to one AL extension. It had about 100+ tables and also extensions on tables like Customer, Sales Header, Journal Line, G/L Entry, and Customer Ledger Entry. We tested some processes, like creating and posting sales invoices, by executing the same processes on the C/AL and AL databases on the same local machine. Our result was that the processes that hit the table extensions were 5-10% slower. In our case acceptable; though, we just had one big extension.

  5. What about Microsoft making the extension functionality a bit smarter and giving us some more control over when to do this join? Something like an extra property on the table extension object, or an extra parameter when reading the table with the extension?
    Suppose the following scenario:
    The Customer table has an extension with an extra Code field referring to an extension table.
    I’d like to show this field on the page. Of course, then this join should be performed automatically, like FlowFields are calculated only when they are shown on a page or when a CalcFields is done.
    This extra Code field on the Customer table is only used in a small piece of code to retrieve the settings in the extension table. What about an extra parameter on the Get to indicate that I also want my extra field? Or an extra property on the table extension, with which I can determine to only perform the join if the Customer table is used in my extension?
    Something smarter should be possible to avoid this ‘Table Extension Hell’!?

  6. Maybe I’m completely mistaken about the inner workings of the Extensions,

    But wouldn’t the use of SetAutoCalcFields make this whole solution obsolete?
    I know you want the same experience, but wouldn’t you be better off just using the extensions in the first place?

    The point James is making is about the joins needed to get the complete set of data, including the extension fields.

    If we create a flowfield, and use the SetAutoCalcFields, won’t this do exactly the same? Create joins for each extension?

    James even writes:

    “We only use table extensions if the field is required on a page, even then we try to put them in a factbox and lookup from a separate table. We treat then as circumspectly as we have flow fields the last few years. Only if it’s absolutely unequivocally needed!”

  7. How well do you think the Partial Records feature will fix the performance issues recognized in pre-BC17 releases?

    I work for a solution partner, and back in BC14 we started migrating our solution into separate extensions (as suggested by MS back then), which are now running mostly on on-prem servers. Due to the separation, we have many “hot” tables that are extended by different installed extensions, and we realized the performance issues when it was already too late.

    We’d have to put up quite some resources at the moment to refactor all our extensions in a way similar to your suggestion here. My question is basically: is it still worth putting up those resources?
    We would need to data-upgrade the data into the supplemental tables for ~200 tables, force away the old extension tables, test everything, and do coordinated upgrades that force away the extensions without data loss for all our customers. And we’re already understaffed for our daily business (which is the reason we didn’t do it already).

    Now, *IF*, on the other hand, MS has implemented the partial record loading implicitly on pages, AND adjusts their internal code to do similar things (if there is no event subscriber with a record passed, there can’t be any table extension field needed, for example), we’d probably waste a lot of resources on an already solved problem.
