Dynamic Method Binding in Delphi

This article provides detailed information on Data Abstract's Dynamic Method Binding. The concepts are illustrated by considering the Fetch sample shipped with Data Abstract for Delphi. We show you how you can use it to stream data in a variety of ways to maximize the user experience. At the end of this article we also show how to send custom SQL commands to the server in order to return dynamic row sets.

Introduction

Client/server applications are known not to be the optimal choice when dealing with a large number of users or when working across the Internet. While multi-tier applications solve both of those problems, they also introduce new ones.

The hardest one to deal with, when moving from the client/server world, is probably related to fetching data. For instance, what will happen if a query returns 100,000 rows and the client is connected to your server using a 56k dial-up connection?

While in most cases problems like this only arise from an inappropriate user interface design, it's also possible that the connection used is just too slow to support even a minimal set of records. Users don't like to wait 5 or 10 seconds after they press a button, and this might ultimately affect your business.

Current Solutions

Frameworks like ADO.NET promote an all-or-nothing approach: you send all the records in one shot, including both the master and detail tables. While this might simplify development and greatly enhance scalability, it's definitely not the most appropriate solution for all cases. What if you don't know what you are looking for (e.g. trying to find an order that was processed last year) and your result set contains half a million records?

Frameworks like DataSnap provide a number of other options, such as details-on-demand or paging. While this improves things quite a bit, the implementation of IAppServer and TClientDataSet makes it difficult to optimize database requests, and it requires a stateful connection between client and server. This dramatically affects the scalability of your system and quickly drains server resources. There are ways around this, but they require manual coding and a good knowledge of the framework.

Data Abstract's approach

Data Abstract's Dynamic Method Binding avoids these problems and provides a data streaming solution that fits your business problems and design taste.

Data Abstract supports incremental (or paged) fetch for large row sets, details-on-demand, all-in-one-shot and combinations of the above (e.g. master page and details on demand). All this in a completely stateless fashion, allowing you to use the specific RDBMS dialect/engine at its best.

A Practical Illustration

As stated above, we will show extracts from the Fetch sample shipped with Data Abstract for Delphi. First, screenshots of the application are shown, followed by extracts from the source showing how the functionality has been achieved.

Note: if you have installed Data Abstract (full or trial version), you can follow the steps below to test the sample on your own system. If you haven't, simply ignore the bulleted instructions - the article is written so that you do not need to install anything to understand it.

  • Compile both server and client and then launch them.

The Paged Orders tab shows how the combination of "paged fetching" and "details on demand" works. The application reads 40 orders at a time and fetches the associated detail records every time you move to a new order. This approach provides the users with immediate results and is ideal for dealing with large datasets.

A potential drawback is the increased network traffic caused by the number of (small) requests sent to the server. However, this is mitigated by the client caching all records it receives.

  • This is how the form looks after you click the Open/Close button:

Pay close attention to the scroll bar on the right. The application is not aware of how many records are present server side, so it assumes that what it has is all there is. After scrolling down and hitting the last record in the current batch, the client issues a new request to the server to check if more records are available. It keeps doing so until the server responds with fewer than 40 records (which indicates that we have reached the last page).

This is how the client will look after the final fetch:

As you can see, the scroll bar is much smaller now.

And what about the Order Details? They are only fetched when required: depending on how you reached the last record (by pressing Page Down or the Down key), the client fetched either a minimal number of detail records or the details for every master it passed.

  • Now click the Fetch All tab and click Open/Close.

This is probably the best approach when you know you won't have too many records (e.g. 200) and want to provide the users with good filtering capabilities. If used correctly, it's fast and puts little overhead on the network. If the number of rows is too big, however, it might negatively affect performance and user experience.

This is what you will be presented with:

If you scroll down (regardless of how you do it), you will notice the scroll bar doesn't change in size. This is simply because we already have all the records client side and no more fetching is required.

Tip: If you want, you could save these records on the client computer and use them offline in a briefcase fashion. DataTables can maintain a delta of changes which will be saved along with the data. Deltas are maintained if the TDADataTable.LogChanges property is set to True.

See the Briefcase sample shipped with Data Abstract for a working example.
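
To make the tip concrete, here is a minimal sketch of the briefcase idea (the procedure names are illustrative, not from the sample; SaveToFile, LoadFromFile and LogChanges appear later in this article, while ApplyUpdates is assumed to be the call that posts the delta back to the server):

```pascal
// Persist a table locally for offline use. With LogChanges enabled,
// pending edits (the delta) are saved together with the data.
procedure SaveForOffline(aTable: TDADataTable; const aFileName: string);
begin
  aTable.LogChanges := True;      // keep a delta of local edits
  aTable.SaveToFile(aFileName);   // data + delta go into the file
end;

// Later, possibly in another session, restore rows and pending delta.
procedure ResumeFromBriefcase(aTable: TDADataTable; const aFileName: string);
begin
  aTable.LoadFromFile(aFileName);
  // Once a connection is available again, the saved delta can be
  // sent to the server (assumed call):
  // aTable.ApplyUpdates;
end;
```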

This concludes the brief overview of the application's functionality. We now discuss the code needed to achieve it.

The code: server side

The server exposes only one service which is declared as follows:

IFetchService = interface(IDataAbstractService)
    ['{E70F3537-38F1-4C4E-AB64-6935C9C9CF69}']
    function GetOrdersAndDetails(const StartOrderID: Integer;
                                 const EndOrderID: Integer): Binary;
  end;

This is a completely arbitrary interface that was defined specifically for the Fetch sample. Data Abstract doesn't force you to use pre-determined interfaces or lock you into a "standard" way of doing things, even though it does provide a basic TDataAbstractService implementation that you can choose to use or inherit services from.

In Data Abstract you define what "standard" means and design the interfaces you want to work with.

Let's take a look at the GetOrdersAndDetails method in detail. It returns all the orders between and including StartOrderID and EndOrderID together with their associated detail records:

function TFetchService.GetOrdersAndDetails(const StartOrderID: Integer;
                                             const EndOrderID: Integer): Binary;

  procedure ProcessDataset(aDataset: IDADataset);
  begin
    aDataset.ParamByName('StartingOrderID').AsInteger := StartOrderID;
    aDataset.ParamByName('EndingOrderID').AsInteger := EndOrderID;
    aDataset.Open;
    DataStreamer.WriteDataset(aDataset, [woRows], -1);
  end;

begin
  result := Binary.Create;
  DataStreamer.Initialize(result, aiWrite);
  try
    // writing orders
    ProcessDataset(Schema.NewDataset(Connection, 'Orders'));
    // writing order details
    ProcessDataset(Schema.NewDataset(Connection, 'OrderDetails'));
  finally
    DataStreamer.Finalize;
  end;
end;

Notice how both datasets are streamed using a single stream. This is possible because Data Abstract data streamers support streaming multiple datasets and deltas in one batch. This feature is extremely important when the client sends updates that need to be applied in the same transaction or, as in this case, when you want to minimize server round trips.
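
For illustration, this is roughly how a client could unpack such a stream by hand (a hypothetical helper, not part of the Fetch sample, which uses the RemoteDataAdapter instead; it assumes the streamer exposes ReadDataset and an aiRead mode as counterparts to the WriteDataset/aiWrite calls shown above):

```pascal
// Unpack both datasets from the single Binary returned by
// GetOrdersAndDetails. DataStreamer is assumed to be the same kind of
// streamer component used on the server.
procedure UnpackOrdersAndDetails(aData: Binary;
  aOrders, aDetails: TDADataTable);
begin
  DataStreamer.Initialize(aData, aiRead);
  try
    // Datasets must be read in the same order they were written.
    DataStreamer.ReadDataset(aOrders);
    DataStreamer.ReadDataset(aDetails);
  finally
    DataStreamer.Finalize;
  end;
end;
```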

The code: client side

On the client side, we make extensive use of DataTables and Dynamic Method Binding. The client form contains four DataTables: an orders and order details pair for each fetch method.

This is how the client form looks:

Paged Orders

The two DataTables for this page are tbl_PagedOrders and tbl_PagedOrderDetails. They have standard properties which were created by a project wizard. A master/detail relation was created using the mmWhere MasterMappingMode:
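
Expressed in code, the detail-table wiring would look roughly like this (a sketch only; the sample sets these at design time, and every name except MasterMappingMode/mmWhere, which the text above cites, is an assumption):

```pascal
// Link the detail table to the master. mmWhere makes the detail rows be
// fetched with a WHERE clause against the current master row.
tbl_PagedOrderDetails.MasterMappingMode := mmWhere;
tbl_PagedOrderDetails.MasterSource := src_PagedOrders;  // assumed source name
tbl_PagedOrderDetails.MasterFields := 'OrderID';        // assumed link fields
tbl_PagedOrderDetails.DetailFields := 'OrderID';
```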

The paging is controlled using the DataTable's OnAfterScroll event, which was coded as follows:

procedure TFetchClientMainForm.tbl_PagedOrdersAfterScroll(
  DataTable: TDADataTable);
var
  lrecordcount: integer;
begin
  { This is the core of the paging process. When we reach the last record in
    the current set, we issue a new request to the server to get the next
    packet. If the number of records returned is smaller than the number
    requested, there is nothing more to fetch and the paging is done.

    By doing paging this way we have full control over what happens server
    side and what SQL statements are generated. This is only one of many
    possible paging implementations you could build with Data Abstract.
    The control is in your hands! }

  if fFetchRequired and DataTable.EOF and not DataTable.Fetching then
    with DataTable do begin
      ParamByName('StartingOrderID').AsInteger := FieldByName('OrderID').AsInteger;
      MaxRecords := seMaxRecords.Value;
      lrecordcount := DataTable.RecordCount;
      LoadFromRemoteSource(True);
      lrecordcount := RecordCount-lrecordcount;

      lbFetchedRecs.Caption := IntToStr(lrecordcount) + ' records retrieved';
      fFetchRequired := lrecordcount = MaxRecords;
      if not fFetchRequired then
        MessageDlg(Format('Done fetching!!! Expected %d rows but only got %d ',
                          [MaxRecords, lrecordcount]), mtInformation, [mbOK], 0);
    end;
end;

Fetch All

The two DataTables for this page are tbl_Orders and tbl_OrderDetails. This is a snapshot of daFetchAll's property values:

The GetDataCall property points to a different method this time: GetOrdersAndDetails. As previously shown, this method returns the records for both tables in one server call.

Note: You can assign any method and our wizard will recreate all the parameters for the method. In this example, all parameter values were set by the wizard.

Two DataTable settings are all that are needed to provide the "all in one fetch" capability for master/detail records:

  • tbl_OrderDetails has RemoteFetchEnabled set to False.
  • tbl_Orders has the moAllInOneFetch option set.

There is no need for the detail table to issue remote calls, since the master fetches data for both.
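
The same two settings can be expressed in code (the sample sets them at design time; the assumption here is that moAllInOneFetch lives in a MasterOptions set property):

```pascal
// Details never call the server themselves...
tbl_OrderDetails.RemoteFetchEnabled := False;
// ...because the master's single call brings back both row sets.
tbl_Orders.MasterOptions := tbl_Orders.MasterOptions + [moAllInOneFetch];
```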

This is the code that gets executed when the "Open/Close" button on this tab is pressed:

procedure TFetchClientMainForm.bFetchAllClick(Sender: TObject);
begin
  { Sets the filtering parameters for the server method so we only retrieve
    the selected records. It's good practice to always filter data like this
    to minimize network traffic.

    The server method is defined as:

      function GetOrdersAndDetails(const StartOrderID: Integer;
                                   const EndOrderID: Integer): Binary;

    The Binary returned by the server contains a stream with all the orders
    and all the details }

  daFetchAll.GetDataCall.ParamByName('StartOrderID').AsInteger := seStart.Value;
  daFetchAll.GetDataCall.ParamByName('EndOrderID').AsInteger := seEnd.Value;
  tbl_Orders.Active := not tbl_Orders.Active;
end;

The only thing required was to set the range of Order records to be retrieved from the server.

Custom SQL Queries

Another practical and often needed example for the use of Dynamic Method Binding can be seen in the "Dynamic SQL" sample shipped with Data Abstract for Delphi.

This example shows how to send an SQL select command to the server and receive back a row set with its schema (list of fields and their attributes).

This client screen shot was taken after the server executed the query typed in the memo box:

The server exposes a service defined as follows:

IDynSQLService = interface(IDataAbstractService)
    ['{02F71273-9E5C-4BD7-81EF-3BD4663EA0AB}']
    procedure MyUpdateData(const Delta: Binary);
  end;
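
The interface above only declares the update method; data retrieval goes through the adapter's GetDataCall, which (as the client code below shows) carries an aSQLText parameter. A hypothetical sketch of what the server-side retrieval method behind that call might look like, modeled on the GetOrdersAndDetails implementation shown earlier (the method name, signature, the Connection.NewDataset helper and the woSchema option are all assumptions - the sample's actual implementation may differ):

```pascal
// Execute the SQL text typed on the client and stream the result back,
// optionally including the schema (field list and attributes).
function TDynSQLService.GetData(const aSQLText: string;
  aIncludeSchema: Boolean; aMaxRecords: Integer): Binary;
var
  lDataset: IDADataset;
begin
  result := Binary.Create;
  lDataset := Connection.NewDataset(aSQLText);  // assumed helper
  lDataset.Open;
  DataStreamer.Initialize(result, aiWrite);
  try
    if aIncludeSchema then
      DataStreamer.WriteDataset(lDataset, [woSchema, woRows], aMaxRecords)
    else
      DataStreamer.WriteDataset(lDataset, [woRows], aMaxRecords);
  finally
    DataStreamer.Finalize;
  end;
end;
```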

The client accesses service methods with the following calls:

procedure TDynSQLMainClientForm.RetrieveSchemaClick(Sender: TObject);
begin
  InitRDA(True, 0);
  DataTable.Close;
end;

procedure TDynSQLMainClientForm.RetrieveDataClick(Sender: TObject);
begin
  InitRDA(False, seMaxRecs.Value);
end;

procedure TDynSQLMainClientForm.RetrieveSchemaAndDataClick(Sender: TObject);
begin
  InitRDA(True, seMaxRecs.Value);
end;

procedure TDynSQLMainClientForm.SaveClick(Sender: TObject);
begin
  DataTable.SaveToFile(ExtractFilePath(Application.ExeName) + 'Data.dat');
end;

procedure TDynSQLMainClientForm.LoadClick(Sender: TObject);
begin
  DataTable.LoadFromFile(ExtractFilePath(Application.ExeName) + 'Data.dat');
end;

procedure TDynSQLMainClientForm.UpdateClick(Sender: TObject);
var
  deltadata: Binary;
begin
  deltadata := Binary.Create;
  try
    // Packs the delta of the data table and sends it over.
    DataStreamer.Initialize(deltadata, aiWrite);
    DataStreamer.WriteDelta(DataTable);
    DataStreamer.Finalize;

    (svcDynSQLService as IDynSQLService).MyUpdateData(deltadata);
  finally
    deltadata.Free;
  end;
end;

procedure TDynSQLMainClientForm.DataTableBeforeRefresh(DataTable: TDADataTable);
begin
  DARemoteDataAdapter.GetDataCall.ParamByName('aSQLText').AsString :=
                                                                Memo.Lines.Text;
end;

procedure TDynSQLMainClientForm.InitRDA(aIncludeSchema: Boolean;
  AMaxRecords: integer);
begin
  with DataTable do begin
    Close;
    DARemoteDataAdapter.GetDataCall.ParamByName('aSQLText').AsString :=
                                                                Memo.Lines.Text;
    MaxRecords := AMaxRecords;
    DARemoteDataAdapter.Fill([DataTable], true, aIncludeSchema);
  end;
end;

The following screenshot shows the RemoteDataAdapter class's property values:

Conclusion

Dynamic Method Binding provides a truly object-oriented way to fetch data and, at the same time, improves design elegance because it does not restrict data fetching to predefined (and often rigid) method calls.

DataTables can fetch data in a variety of ways, providing a solution to almost every scenario you might face when streaming data, from tiny to huge record sets.

By combining these capabilities, you can design services and clients that are easy to code and maintain.