Attack Your Server Less Often
Overview
The responsiveness of a web site depends on many factors. If your website interacts with a database, one of the largest is how often you access that database, so you have a real responsibility to keep that number down.
Consider a number of things regarding your site's design:
How often do you (or must you) "hit" the database per user, per form?
Are you sharing common data between users, or is each user requesting the same data separately?
Are you pulling all columns from a table or just those that you need?
You should never query or update a table more often than necessary, and never transmit more data than necessary over the wire. We will explore a couple of techniques for reducing your database activity, improving your users' experience while reducing the load on your server. These techniques will not apply to all situations.
Load and cache common data
ASP.NET has a very powerful Cache mechanism. It is not to be abused, but it certainly can be used for smaller sets of commonly used data.
Many times there are sets of data (e.g., State codes and States) that rarely change and are common to all users. These can be queried and stored in Cache such that subsequent user requests will access the cached version rather than causing a database read. The code must be written such that if the cached data does not exist, the data will be read and stored.
(The class function below uses two other functions [not shown], ReturnLiveConnection and ReturnDataTable, to handle the mundane aspects of building a DataTable; a stored procedure would also be more efficient than the dynamic SQL I use here.)
The point here is that the data is stored in Cache, and if the data is not there the query is run and the result is stored for the next request. Since States are seldom added (or deleted!) I did not bother to set this Cache up to refresh itself if the source data table was modified. In this example the data drops out of the Cache after 100 minutes of non-use (a sliding expiration); 100 is just an arbitrary number that I chose.
Pulling the States from Cache takes a fraction of the time that hitting the database does. Multiply that by the number of your users and you have a substantial activity savings.
' get State codes and descriptions
'----------------------------------------------------------
Public Shared Function GetStates() As DataTable
Dim dt As DataTable
' see if the data table exists in the Cache
dt = CType(HttpRuntime.Cache("STATES"), DataTable)
' nope, query and store
If dt Is Nothing Then
Dim sql As String = "select Code, [Desc] from States with (nolock)"
dt = Util.ReturnDataTable(sql, CommandType.Text, Util.ReturnLiveConnection)
' put in Cache, expires after 100 minutes of non-use
HttpRuntime.Cache.Insert("STATES", dt, Nothing, Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(100))
End If
Return dt
End Function
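The same cache-aside pattern is language-agnostic. Here is a minimal Python sketch of it, using a plain dict with an expiry timestamp as a stand-in for ASP.NET's HttpRuntime.Cache (the function and key names are mine, chosen to mirror the VB example above):

```python
import time

# A plain dict standing in for HttpRuntime.Cache, just to illustrate
# the pattern. Keys map to (value, expiry-timestamp) pairs.
_cache = {}

def get_states(query_db, ttl_seconds=6000):
    """Return the States data, hitting the database only on a cache miss."""
    entry = _cache.get("STATES")
    if entry is not None:
        value, expires = entry
        if time.time() < expires:
            return value                    # cache hit: no database access
    value = query_db()                      # cache miss: query once...
    _cache["STATES"] = (value, time.time() + ttl_seconds)  # ...and store
    return value
```

Every caller goes through `get_states`; only the first caller (per expiry window) pays the cost of the database round trip.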
Batch Updates
This tip can be used in certain situations where the user has data that is only accessible to them, is "reasonable" in size, and is not critical. Let me explain: if a user's data is limited to a reasonable number of rows and columns and is such that storing it in Session is "safe enough", then this technique may be used.
If a user owns a number of rows and will update them several times during their website visit, sending updates, inserts, and deletes to the server and then re-querying the server after each edit is not efficient. A main benefit of the disconnected data model is that you can connect to the database, get your data, disconnect, operate on that data, and then at some point push your changes back to the server.
This solution is not for every application. I would not use it for banking or other critical scenarios. I would not use it when more than several dozens of rows are being pushed back to the server. But in the right situation this solution may fit the bill for you and it will save you many database accesses.
So we will:
Query the data and store it in a dataset.
Store the DataSet in Session.
Allow repeated updates, deletes, and inserts to the dataset without saving to the database after each change. The dataset is stored back into Session after changes.
When the user has completed all of their modifications, push changes back to the Server.
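The four steps above can be sketched in Python, with sqlite3 standing in for SQL Server and plain lists standing in for the DataSet's change tracking. The table and column names loosely mirror the Messages table used in the article, but the Body column and both function names are my inventions for illustration:

```python
import sqlite3

def load_messages(conn, user_id):
    """Steps 1-2: query once and keep a disconnected, in-memory copy."""
    rows = conn.execute(
        "SELECT M_ID, Body FROM Messages WHERE A_ID = ?", (user_id,)).fetchall()
    # Step 3: pending changes accumulate in these lists instead of a DataSet.
    return {"rows": rows, "updates": [], "inserts": [], "deletes": []}

def push_changes(conn, ds, user_id):
    """Step 4: send every accumulated change in one batch, one transaction."""
    with conn:  # commits (or rolls back) the whole batch together
        conn.executemany(
            "UPDATE Messages SET Body = ? WHERE M_ID = ?", ds["updates"])
        conn.executemany(
            "INSERT INTO Messages (A_ID, Body) VALUES (?, ?)",
            [(user_id, body) for body in ds["inserts"]])
        conn.executemany(
            "DELETE FROM Messages WHERE M_ID = ?", [(m,) for m in ds["deletes"]])
    ds["updates"], ds["inserts"], ds["deletes"] = [], [], []
```

However many edits the user makes, the database sees exactly one load and one save.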
A couple of design issues in my code to be aware of:
This code started out in .NET 1.1 and was migrated to 2.0.
Most of my pages inherit from a base class (which itself inherits from System.Web.UI.Page), so when you see MyBase.GetUserSession, the code-behind of the web page is calling the base class method.
I use a class, AUser, to hold user data. In this example the class instance is stored in Session and is used by multiple pages.
The web form uses a SqlDataAdapter (which I normally stay away from, but need here for the batch update of the dataset), its associated cmdSelect, cmdInsert, cmdUpdate, and cmdDelete command objects, and the SQL that has to be configured for all of this mess.
Some code has been removed for readability.
A snippet from AUser.vb:
' this holds info gathered about the user at login time
'----------------------------------------------------------
Public Class AUser
Dim _messageDS As DataSet
Dim _UserId As Integer
Dim _Current_MId As Integer
Public Property messageDS() As DataSet
Get
Return _messageDS
End Get
Set(ByVal Value As DataSet)
_messageDS = Value
End Set
End Property
Public Property UserId() As Integer
Get
Return _UserId
End Get
Set(ByVal Value As Integer)
_UserId = Value
End Set
End Property
Public Property Current_MId() As Integer
Get
Return _Current_MId
End Get
Set(ByVal Value As Integer)
_Current_MId = Value
End Set
End Property
End Class
A snippet from the form's base class:
' retrieve AUser instance from Session
'----------------------------------------------------------
Protected Function GetUserSession() As AUser
Dim userSession As AUser = CType(Session("USER"), AUser)
Return userSession
End Function
Form level vars of WebForm:
Dim ds As New DataSet
Dim thisUser As AUser
Added by Form Designer to WebForm:
Me.sqlConn1 = New System.Data.SqlClient.SqlConnection
Me.cmdSelect = New System.Data.SqlClient.SqlCommand
Me.cmdInsert = New System.Data.SqlClient.SqlCommand
Me.cmdUpdate = New System.Data.SqlClient.SqlCommand
Me.cmdDelete = New System.Data.SqlClient.SqlCommand
Me.daMessages = New System.Data.SqlClient.SqlDataAdapter
OK, let's take a look at the code in the Web Form that uses this technique.
Loading up the DataSet (such as in Form_Load):
' Get user's data into form var ds, table "Messages"
'----------------------------------------------------------
Private Function FillDataset() As DataSet
If thisUser Is Nothing Then thisUser = MyBase.GetUserSession
If thisUser.messageDS Is Nothing Then
' clear table out of form var ds
If ds.Tables.Contains("Messages") Then ds.Tables("Messages").Clear()
daMessages.SelectCommand.Parameters("@A_Id").Value = thisUser.UserId
daMessages.Fill(ds, "Messages")
thisUser.messageDS = ds
Session.Add("USER", thisUser)
End If
Return thisUser.messageDS
End Function
'----------------------------------------------------------
' Ensures form var ds is loaded, optionally bind to grid
'----------------------------------------------------------
Private Sub Find_and_Bind(bind As Boolean)
Dim dv As DataView
' if the form level var ds tablecount is 0
' load the dataset from Session to this local variable
If ds.Tables.Count = 0 Then ds = FillDataset()
If Not bind Then Exit Sub
' I use the view for binding to the grid
dv = ds.Tables(0).DefaultView
If dv.Count > 0 Then
grdMessages.DataSource = dv
grdMessages.DataKeyField = "M_ID"
grdMessages.DataBind()
End If
End Sub
What we have done here is pull the user's messages out of Session; if they are not there yet, the database is queried and the result is stored in Session. This dataset stored in Session will contain any cumulative edits that have not yet been pushed to the server, and if it does contain unsaved changes we can detect this and display a message to the user, along with a "push these changes to the server" button.
A typical Edit scenario would be for the user to select a row from the dataset, displayed in our grid, to be edited:
Dim dv As DataView
Dim drv As DataRowView
Dim m_Id As Integer
' this fills form variable ds
Find_and_Bind(False)
' the dataview the grid is bound to
dv = ds.Tables(0).DefaultView
' get the DataRowView being edited, and retrieve the MessageId
drv = dv(e.Item.ItemIndex)
m_Id = CInt(drv("M_ID"))
' update our AUser instance, indicating M_Id being edited
thisUser.Current_MId = m_Id
' navigate to form to edit single row
Response.Redirect("Edit.aspx?mode=E&rurl=messages.aspx")
End Sub
Here is a typical Insert scenario:
dt = thisUser.messageDS.Tables(0)
' build a new, empty datarow (it will be in the datatable's "schema")
Dim drEdit As DataRow = dt.NewRow
' populate the account id field
drEdit("A_ID") = thisUser.UserId
' build a fake MessageId value, for now.
' Since M_ID is an auto-inc field, when
' we post the dataset to the server
' it will be generated by the system.
' the fake M_Id here in the dataset
' is for my convenience.
Dim i As Integer = -1 * (dt.DefaultView.Count + 1) ' fake row id
drEdit("M_ID") = i
' Populate the other fields of new row
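The fake-key trick in the snippet above is language-agnostic: new in-memory rows get temporary negative keys, which can never collide with the real, positive auto-increment keys the database will assign at save time. A small Python sketch (the function name and row shape are mine):

```python
def add_row(rows, user_id, body):
    """Add a new in-memory row with a temporary negative key.

    The real auto-increment M_ID is only generated by the database at
    save time, so negative placeholders are guaranteed not to collide
    with any key already in the table.
    """
    temp_id = -(len(rows) + 1)          # -1, -2, -3, ...
    rows.append({"M_ID": temp_id, "A_ID": user_id, "Body": body})
    return temp_id
```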
And now our Delete scenario:
Dim dt As DataTable
Dim dr As DataRow
' this fills form variable ds
Find_And_Bind(False)
dt = ds.Tables(0)
dr = dt.Rows(e.Item.ItemIndex)
' Delete the item selected in the DataGrid from the data source.
If Not dr Is Nothing Then
dr.Delete()
End If
' update Session var, rebind Grid since we removed a row
thisUser.messageDS = ds
Find_And_Bind(True)
End Sub
We have somewhat crudely covered the basic edit, insert, and delete activity against a DataTable that lives inside our DataSet, which in turn is part of the AUser instance stored in Session. There is one helpful step when refreshing the form that hosts the datagrid the table is bound to:
' if our dataset has changes, display a label indicating
' that unsaved changes exist and to please push them
' to the server before they log out.
'----------------------------------------------------------
Private Sub SetHasChangesControls()
lblChanges.Visible = ds.HasChanges
btnSaveDB.Visible = ds.HasChanges
End Sub
Now it's time to push the changes in our dataset back to the server:
Dim bHasAdds As Boolean = False
' this fills form variable ds
Find_And_Bind(False)
Try
bHasAdds = ds.HasChanges(DataRowState.Added)
' we finally use the power of the SqlDataAdapter!
If ds.HasChanges Then daMessages.Update(ds.Tables(0))
' if I have Inserts I need to refresh dataset ds to
' pick up the primary key M_ID generated when the new
' row was inserted, so that my dataset is up to date
If bHasAdds Then
thisUser.messageDS = Nothing
ds = FillDataset()
End If
SetHasChangesControls()
Catch ex As Exception
Dim s As String = ex.Message
lblMsg.Text = s
lblMsg.Visible = True
End Try
End Sub
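The refresh after inserts matters because the server, not the client, generates new primary keys; until the in-memory copy is reloaded, its fake negative IDs do not match reality. A minimal sqlite3 illustration of reading back a generated key after an insert (in the ADO.NET code above, the re-query in FillDataset plays this role; the function name here is mine):

```python
import sqlite3

def insert_and_get_key(conn, user_id, body):
    """Insert a row and read back the key the database generated, so the
    in-memory copy can swap its temporary fake ID for the real one."""
    with conn:  # commit the insert
        cur = conn.execute(
            "INSERT INTO Messages (A_ID, Body) VALUES (?, ?)", (user_id, body))
    return cur.lastrowid  # the auto-increment M_ID the server assigned
```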
This technique is not new; classic ADO had the Recordset's UpdateBatch method. .NET adds another layer with the SqlDataAdapter, but the effect is the same. The SQL that must be configured on the command objects for the Insert, Update, and Delete actions is usually not fun to set up, so there is a price.
However, in those situations where this technique is a good fit, you can save a lot of over-the-wire and database activity by sending a user's accumulated changes to the back end once; note that the user will normally need to be the one to initiate the save.
Accessing the database is one of the costliest operations in any application. Hopefully these two techniques can help you reduce the number of times you have to do it.
Steve