Among the very first language keywords one comes across when learning VBA is the Dim keyword; declaring and using variables is easily the first step one takes on the journey away from the macro recorder.
About Scopes
Before we can really understand what variables do and what they’re useful for, we need a minimal grasp of the concept of scoping. When you record a macro, the executable instructions are generated for you inside a procedure scope that’s delimited with Sub and End Sub tokens (tokens are the grammatical elements of the language, not necessarily single keywords), with the identifier name of the macro after the Sub keyword:
Sub DoSomething()
' executable code goes here
End Sub
Exactly none of the above code is executable, but compiling it creates an entry point that the VBA runtime can invoke and execute, because the procedure is implicitly public and, as such, can be accessed from outside the “Module1” module it exists in (with or without Option Private Module). In other words, the above code could tell us explicitly what the scope of the DoSomething procedure is, using the Public keyword before the Sub token:
Public Sub DoSomething()
' executable code goes here
End Sub
If we used Private instead, then Excel (or whatever the host application is) could not “see” it, so you would no longer find DoSomething in the list of available macros, and other modules in the same VBA project couldn’t “see” or invoke it either; a private procedure is only callable from other procedures in the same module.
Standard modules are themselves public, so you can refer to them from any other module in your project, and invoke their public members using the member access operator, the dot:
Public Sub DoStuff()
Module1.DoSomething
End Sub
Because public members of public modules become part of a global namespace, they can be referred to without an explicit qualifier:
Public Sub DoStuff()
DoSomething
End Sub
While convenient to type, it also somewhat obscures exactly what code is being invoked: without an IDE and a “navigate to definition” command, it would be pretty hard to know where that other procedure is located.
The global namespace contains not only the public identifiers from your VBA project, but also all the public identifiers from every referenced library, and they don’t need to be qualified either: that’s how you can invoke the VBA.Interaction.MsgBox function without qualifying it with the library or module it’s defined in. If you write your own MsgBox function, every unqualified MsgBox call in that project now invokes that new custom function, because VBA always prioritizes the host VBA project’s own type library over the referenced ones (every VBA project references the VBA standard library and the type library that defines the COM extension and automation model for the host application).
But that’s all going outward from a module: within a module, there are two levels of scoping: module-level members can be accessed from anywhere in the module, and procedure-level declarations can be accessed from anywhere inside that procedure.
Module-level declarations use Public and Private modifiers, and procedure-level ones use the Dim keyword. Dim is legal at module level too, but because Private and Public are only legal at module level (you can’t use them for procedure scope / “local” declarations), Rubberduck encourages you to use Dim for locals only.
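For illustration, here is a quick sketch showing all three (module and variable names are illustrative):

Option Explicit

Public PublicVariable As Long    ' module-level, accessible from the entire project
Private ModuleVariable As Long   ' module-level, accessible from anywhere in this module

Public Sub DoSomething()
    Dim LocalVariable As Long    ' procedure-level, accessible only within DoSomething
    LocalVariable = ModuleVariable + PublicVariable
End Sub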
Scoping in VBA is not block-level: a variable declared inside a conditional block is allocated whether or not the condition is ever met, and a variable declared inside a loop body is the same variable outside that loop, and for every iteration of that loop as well.
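A sketch to illustrate (hypothetical code):

Public Sub IllustrateProcedureScope()
    If False Then
        Dim Value As Long ' this block never runs, yet Value exists anyway...
    End If
    Value = 42 ' ...because the declaration is procedure-wide, this compiles fine

    Dim i As Long
    For i = 1 To 3
        Dim Count As Long ' one single Count variable, not one per iteration
        Count = Count + 1
    Next
    Debug.Print Count ' prints 3: still in scope after the loop
End Sub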
Non-Executable Statements
Procedures don’t only contain executable instructions: Dim statements, like statements with Private and Public modifiers, are declarative and do not do anything; you cannot place a debugger breakpoint (F9) on such statements, either. This is important to keep in mind: the smallest scope in VBA is the procedure scope, and it includes the parameters and all the local declarations of that procedure, regardless of where in the procedure body they’re declared. So the reason to declare variables as you need them has more to do with reducing mental load, and with making it easier to extract a method by moving a chunk of code into another procedure scope. Declaring all locals at the top of a procedure often results in unused variables dangling, because of the constant up-and-down, back-and-forth scrolling that inevitably happens when a procedure eventually grows.
Const statements (to declare constant values) are also legal in local/procedure scope, and they’re identically non-executable; the same applies to Static declarations (variables that retain their value between invocations).
ReDim statements however are executable, even though they also count as a compile-time declaration – but they don’t count as a duplicate declaration, so the presence of ReDim doesn’t really justify skipping an initial Dim declaration.
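For example, only the ReDim statement in this sketch is executable:

Public Sub IllustrateDeclarations()
    Const MaxItems As Long = 10 ' non-executable: a local constant
    Static CallCount As Long    ' non-executable: value persists between calls
    CallCount = CallCount + 1

    Dim Items() As String       ' non-executable declaration of a dynamic array...
    ReDim Items(1 To MaxItems)  ' ...resized here, by an executable statement
End Sub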
Explicitness as an Option
Not only access modifiers can be implicit in VBA; the language also lets you define a Variant variable on the fly, without a prior explicit declaration. While this behavior is practical for getting the job done and will indeed work perfectly fine, it also unnecessarily puts you at risk of typos that will only become a problem at run-time, if you’re lucky close enough to the source of the problem to hunt it down and debug. By specifying Option Explicit at the top of every module, the compiler treats implicit declarations as compile-time errors, telling you about the problem before it even becomes one.
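For example, without Option Explicit this hypothetical snippet compiles and runs, silently producing a wrong result:

Sub CalculateTotal()
    Dim Total As Long
    Total = 10
    Total = Totall + 32 ' typo: "Totall" is an implicit, empty Variant
    Debug.Print Total   ' prints 32, not 42 - and no error is ever raised
End Sub

With Option Explicit specified, the Totall typo becomes a “Variable not defined” compile-time error instead.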
Option Explicit has its limits though, and won’t protect you from typos in late-bound member calls, where invoking a member that doesn’t exist on a given object throws error 438 at run-time.
When to Declare a Variable
There are many reasons to declare a variable, but if you’re cleaning up macro-recorder code, the first thing you’ll want to do is remove the dependency on Selection, and qualify Range and Cells member calls with a proper Worksheet object.
For example, “before” might look like this:
Sub Macro1()
Range("A10") = 42
Sheet2.Activate
Range("B10") = 42
End Sub
And “after” might look like this:
Public Sub Macro1()
Dim Sheet As Worksheet
Set Sheet = ActiveSheet
Sheet.Range("A10") = 42
Sheet2.Activate
Sheet.Range("B10") = 42
End Sub
The two procedures look equivalent, but only one of them behaves predictably. If the Sheet2 worksheet is already active, then there’s no difference and both versions produce identical output. Otherwise, the first one writes to whatever the ActiveSheet is, activates Sheet2, and then writes to that sheet; the second one writes both values to whatever sheet was active when the procedure started.
There’s a notion of state in the first snippet that adds to the number of things you need to track and think about in order to understand what’s going on. Using variables, exactly what sheet is active at any point during execution has no impact whatsoever on the second snippet, beyond the initial assignment.
It’s that (global) state that’s behind erratic behavior such as code working differently when you leave it alone than when you step through – especially when loops start getting involved. Managing that global state makes everything harder than necessary.
Keep your state close, and your ducky closer, they say.
Set: With or Without?
Not being explicit can make the code read ambiguously, especially when you consider that objects in VBA can have default members. In the above snippets, the value 42 reads like it’s assigned to… the object that’s returned by the Range property getter of the Worksheet class. And that’s weird, because normally you would assign to a property of an object, not the object itself. VBA understands what it needs to do here, because the Range class says “I have a default member!” and that default member is implemented in such a way that giving it the value 42 does exactly the same as if the Range.Value member was being invoked explicitly. Because that behavior is an implementation detail, it means the only way to know is to read its documentation.
The Set keyword modifies an assignment instruction and says “we’re assigning an object reference”, so VBA doesn’t try to check if there’s a default member on the left-hand side of the assignment operator, and the compiler expects an object reference on the right-hand side, …and then only throws at run-time when that isn’t the case – but because this information is all statically available at compile-time, Rubberduck can warn about such suspicious assignments.
So to assign a variable that holds a reference to a Range object, we must use the Set keyword; to assign a variable that holds the value of a Range, we must not use the Set keyword. Declaring an explicit data type for every variable (meaning, not only declaring things, but also typing them) helps prevent very preventable bugs and subtle issues that can be hard to debug.
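To illustrate both kinds of assignment (a sketch, assuming Excel, where Sheet1 is a worksheet’s code name):

Public Sub IllustrateSet()
    Dim Cell As Range
    Set Cell = Sheet1.Range("A1") ' assigning an object reference: Set is required

    Dim Content As Variant
    Content = Sheet1.Range("A1")  ' assigning a value: the default member (Value) is read
End Sub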
As SomethingExplicit
Whether Public or Private, whether local or global, most variables are better off with a specific data type using an As clause:
Dim IsSomething As Boolean
Dim SomeNumber As Long
Dim SomeAmount As Currency
Dim SomeValue As Double
Dim SomeDateTime As Date
Dim SomeText As String
Dim SomeSheet As Worksheet
Dim SomeCell As Range
Using an explicit data/class/interface type, especially with objects, helps keep things early-bound, meaning both the compiler and static code analysis tools (like Rubberduck) can better tell what’s going on before the code actually gets to run.
We can often chain member calls; the Worksheets collection’s indexer necessarily yields a Worksheet object, no?
Public Sub Macro1()
ActiveWorkbook.Worksheets("Sheet1").Range("A1").Value = 42
End Sub
If you manually type this instruction, you’ll notice something awkward happens (or rather, doesn’t happen) when you type the dot operator after Worksheets("Sheet1"): the property returns an Object interface, which tells VBA it has members that can be invoked, but leaves no compile-time clue about any of them. That’s why the Range member call is late-bound and only resolved at run-time: because the compiler has no idea what the members are until the code is running, it cannot populate the completion list with the members of Worksheet, and it will merrily compile and attempt to invoke a Range member.
By breaking the chain and declaring variables, we restore compile-time validations:
Public Sub Macro1()
Dim Sheet As Worksheet
Set Sheet = ActiveWorkbook.Worksheets("Sheet1")
Sheet.Range("A1").Value = 42
End Sub
When NOT to Declare Variables
Variables are so nice, sometimes we declare them even when we don’t need them. There are many valid reasons to use a variable, including abstracting the result of an expression behind its value. Assuming every variable is assigned and referenced somewhere, there are still certain variables that are always redundant!
Objects are sneaky little things… not only can they have a default member that gets implicitly invoked, they can also have a default instance that lives in the global scope and is always named after the class it’s an instance of.
Declaring a local variable to hold a copy of a reference to an object that’s already globally accessible is always redundant! Document modules (in Excel that’s ThisWorkbook and the Worksheet modules) and UserForms always have such a default instance:
Public Sub Macro1()
Dim WB As Workbook
Set WB = ThisWorkbook 'redundant and obscures intent!
Dim Sheet As Worksheet
Set Sheet = Sheet1 'redundant, just use Sheet1 directly!
End Sub
Sprinkle Generously
Variables are a simple but powerful tool in your arsenal. Using them raises the abstraction level of your code, trains your brain to stop and think about naming things, and can help prevent binding errors and remove implicit late-binding, keeping your code entirely visible to the compiler and Rubberduck. Used wisely, variables can make the difference between messy, redundant macro-recorder code and squeaky-clean, professionally written VBA code.
In procedural code, a macro might be implemented in some Public Sub DoSomething procedure that proceeds to do whatever it is that it needs do, usually by dereferencing a number of library-defined objects and invoking their members in a top-to-bottom sequence of executable instructions. Clean, nicely written and well-modularized procedural code would have that be a small, high-abstraction public procedure at the top of some SomethingMacro standard module, with increasingly lower-abstraction private procedures underneath.
Looking only at scope names (some of the private procedures might be Functions, and they would likely take parameters), the module for a MakeSalesReport macro might roughly look something like this (procedure names are purely illustrative):
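Option Explicit

Public Sub MakeSalesReport()
End Sub

Private Function PromptFileName() As String
End Function

Private Sub GetSalesData()
End Sub

Private Sub BuildReportSheet()
End Sub

Private Sub SaveReport()
End Sub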
Like “making coffee”, the phrase “make the sales report” is abstracting away quite a lot of smaller sub-steps.
Breaking down a problem into smaller and simpler steps and sub-steps is how we begin to achieve separation of concerns: maybe one of these sub-steps is going to require prompting the user for a filename – if that’s implemented in a separate PromptFileName function that’s only responsible for prompting the user for a filename, then it’s much easier to later (as needed) reuse that function by pulling it into its own, say, Files module, and making it Public.
If programming is a lot like writing a story, then procedures have to be the verbs we use to express the actions carried by our code. The smaller a procedure, the less it can do; the fewer things a procedure does, the easier it is to give it a name that accurately, precisely describes what it does.
Public Sub DoSomething()
'do stuff:
'...
'get the filename:
Dim FileName As String
FileName = ...
'do more stuff:
'...
End Sub
Any chunk of code that can be isolated inside a procedure scope and described with a comment that essentially says “this chunk of code reticulates splines” (whatever that is; maybe it’s “get the filename:”, or a much less subtle “======= GET FILENAME =======”), is a chunk of code that could be extracted into its own ReticulateSplines procedure scope. Doing this replaces the comment, and the entire code block that goes with it, with a single higher-abstraction procedure call that plainly says ReticulateSplines: by properly naming the things we abstract away, we can make our code expressive and [for the most part] self-explanatory.
Option Explicit
Public Sub DoSomething()
DoStuff
Dim FileName As String
FileName = ...
DoMoreStuff FileName
End Sub
Private Sub DoStuff()
'...
End Sub
Private Sub DoMoreStuff(ByVal FileName As String)
'...
End Sub
And that’s glorious already.
With object-oriented programming (OOP), we get to further increase the abstraction level: such a Public Sub DoSomething macro procedure might belong to some Macros or EntryPoints standard module, painting an abstract, broad-brush big picture… with all the spline-reticulating gory details in Private procedures of a separate class module.
Like procedures in procedural code, classes in OOP become another building block to tell our story: with class modules we get to use nouns: procedures do things, objects are things. So we could have a SomeMacro class that encapsulates everything “do something” needs to do, and when we need a DoSomethingElse macro we can implement it in its own dedicated class module too, leaving the Macros module (or EntryPoints, or whatever… just not Module1!) a high-abstraction, broad-brush picture of what’s going on.
This boils down to 1) create the dependencies of the macro class module we want to create; 2) create and initialize the “macro” object, and 3) invoke a Run method to, well, run the macro.
A standard module doing that, might look like this:
Option Explicit
Private Const ConnectionString As String = "..."
Public Sub DoSomething()
' create the dependencies...
Dim DbService As IDbService
Set DbService = SomeDbService.Create(ConnectionString)
' create the macro object, pass/inject the dependencies;
' we know SomeMacro needs a Worksheet and an IDbService
' because its Create factory method takes them as parameter:
With SomeMacro.Create(Sheet1, DbService)
.Run ' runs the macro
End With
End Sub
Public Sub DoSomethingElse()
'we could have another macro here...
'...if that other macro is in another class...
'...does it have a .Run method?
End Sub
This does roughly demonstrate Dependency Injection and Inversion of Control in VBA (glossing over the required predeclared ID hidden attributes here), but in the context of this article, the point of interest is the .Run member call: if we make an object that encapsulates the notion of running a macro, it makes sense for that object to have a Run method. However, if we don’t formalize this concept with an interface, we could have a SomeMacro.Run, then an AnotherMacro.Execute, and why not a SomeOtherMacro.DoSomething: nothing is structuring things and telling the compiler and future maintainers “see, this class is a macro and it has a method that runs it”. So while it’s nice that we’ve cleaned up the Macros module by moving most of the code into class modules, it’s still chaos out there – unless there’s a way to get all macros to agree on exactly how we run them.
How do we tell the compiler “this class is a macro and it has a method that runs it”?
Interfaces and the Implements keyword, of course!
We can do this by adding a new class module (call it IMacro – I’m really not a prefix guy, but abstract interfaces in COM traditionally have that I prefix, and the tradition carried into C# and .NET, so here we are – if this were Java I would have just called it Macro; it’s all just conventions), and then adding a Run method with an empty body – this class shall remain abstract, and the implementation(s) shall be provided by other class modules:
'@ModuleDescription "Represents an executable macro."
'@Interface
Option Explicit
'@Description "Runs the macro."
Public Sub Run()
End Sub
The implementation(s) would be class modules with Implements IMacro and a Private Sub IMacro_Run procedure that invokes a Run procedure, which would break down into smaller, lower-abstraction private procedures underneath, and would delegate the more specialized work to more specialized objects (which would thus become that class’ dependencies). Sound familiar?
Yep. You’re looking at your standard procedural macro, with the only difference being that instead of a standard module it’s now inside a class module that Implements IMacro.
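A minimal sketch of such an implementation, with illustrative procedure names:

Option Explicit
Implements IMacro

Private Sub IMacro_Run()
    Run
End Sub

Public Sub Run()
    DoStuff
    DoMoreStuff
End Sub

Private Sub DoStuff()
    '...
End Sub

Private Sub DoMoreStuff()
    '...
End Sub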
Is this… a command pattern (macro in a class module)? Turns out, it pretty much actually is!
Of course, that’s not the whole story. But yes, it’s indeed a command pattern, however minimal – in design pattern abstraction terminology:
the caller is the Public Sub DoSomething macro procedure
the command is the IMacro interface
the concrete command is the SomeMacro class (implements IMacro)
the SomeDbService dependency would be a receiver, I think
What makes a “macro in a class module” a command pattern is the IMacro interface and how it abstracts the notion of “running a macro”. It represents the abstract concept of “something that can run”, and that, right there, is the command pattern in a nutshell.
Let’s dig a little deeper though, because VBA can do much more than just macros, and commands are everywhere in software.
Divide & Conquer
Say we’re writing a user interface that can add, delete, and update records in a table. We might have a form featuring a ListBox control, and then CommandButton controls to create a new record, delete the selected one(s), and modify an existing one.
In a clean design without the command pattern, code might be written and organized with a “divide & conquer” attitude, and would look something like this (lower-abstraction details omitted, they’re not the point):
Option Explicit
'...
Public Property Get Model() As SomeModel
'gets an object holding the data needed for this form.
End Property
Private Sub CreateNewItem()
With New ItemEditorForm ' new form instance
.Show
If .Cancelled Then Exit Sub
AddToSource .Model ' implies the form has a Model As Something property.
End With
End Sub
Private Sub AddToSource(ByVal Thing As Something)
Model.AddThing Thing ' the SomeModel class needs an AddThing method for this.
End Sub
Private Sub RemoveFromSource(ByVal Thing As Something)
Model.RemoveThing Thing ' the SomeModel class needs a RemoveThing method for this.
End Sub
Private Sub DeleteSelectedItems()
Dim i As Long
For i = Me.ItemsBox.ListCount - 1 To 0 Step -1 ' assumes an ItemsBox listbox
If Me.ItemsBox.Selected(i) Then ' does not assume single-item selections
Dim Item As Something
' assumes a ListSource collection of Something objects
Set Item = ListSource(i)
If Not Item Is Nothing Then
RemoveFromSource Item ' <~ do this work at a lower abstraction level
End If
End If
Next
End Sub
Private Sub EditSelectedItem()
Dim Item As Something
Set Item = ListSource(Me.ItemsBox.ListIndex)
If Item Is Nothing Then Exit Sub
With New ItemEditorForm ' pop a modal with fields for an item...
Set .Model = Item ' <~ this item. (assumes a Model As Something property)
.Show
If .Cancelled Then Exit Sub
UpdateSourceItem .Model ' <~ do this work at a lower abstraction level
End With
End Sub
Private Sub CreateButton_Click()
CreateNewItem ' <~ do this work at a lower abstraction level
End Sub
Private Sub DeleteButton_Click()
DeleteSelectedItems ' <~ do this work at a lower abstraction level
End Sub
Private Sub EditButton_Click()
EditSelectedItem ' <~ do this work at a lower abstraction level
End Sub
'...
By factoring each button action into its own dedicated procedure, we get to name things and clearly split things up by functionality. The job of a Click handler becomes to fork execution elsewhere, so they [often] become simple one-liners invoking a private method, painting a broad-brush picture of what’s going on.
We could just as well implement the functionality in the body of the Click handler, but I personally find extracting these private methods worthwhile, because they make it easier to restructure things later (you can cut/move the entire scope), versus leaving that code in event handlers where the refactoring is more tedious. Event handlers are entry points in a way, enough so that having them at a high abstraction level feels exactly right for me.
Now what if we wanted the EditButton to be enabled only when exactly one item is selected, and the DeleteButton to be enabled only when at least one item is selected? We would have to start handling the ItemsBox.Change event, and would need additional code that might look like this:
Private Sub SetButtonsEnabledState()
Me.EditButton.Enabled = (Model.SelectedItems.Count = 1)
Me.DeleteButton.Enabled = (Model.SelectedItems.Count > 0)
'...
End Sub
Private Sub ItemsBox_Change()
SetModelSelectedItems
SetButtonsEnabledState
End Sub
Imagine a form with many more controls – each with their own “is enabled” rules and a Change event handler procedure: boilerplate… boilerplate code everywhere!
Each command button has its own associated actions implemented in its own set of procedures, and that creates a lot of noise and reduces the signal when we’re reading the code, and that’s a clear sign the abstraction level needs to go up a bit.
Abstraction Levels
Think of the steps involved in making a cup of coffee, in maybe 3-5 steps. Think of a descriptive verb for each step, then think of how each step could be broken down into another 3-5 steps, and use descriptive names for these sub-steps, too. The names at the top level are necessarily going to be more abstract than those in the lower level(s): that’s what “abstraction levels” refers to. Now imagine doing all that in one giant procedure scope, and you can see the benefits of balancing abstraction and indirection in programming 🙂
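In code, the coffee exercise might sketch out like this (entirely illustrative names):

Public Sub MakeCoffee()
    GrindBeans
    BrewCoffee
    ServeCoffee
End Sub

Private Sub GrindBeans()
    'FetchBeans, MeasureDose, RunGrinder...
End Sub

Private Sub BrewCoffee()
    'BoilWater, WetGrounds, WaitForDrip...
End Sub

Private Sub ServeCoffee()
    'PourIntoMug, AddCream, AddSugar...
End Sub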
Moving that boilerplate to Public procedures in standard modules would “work” to clean up the form module… but then it would also pretty much defeat the purpose of encapsulating things into objects… and then when (not if) one such procedure needs any state, then that state soon becomes global state, and that is absolutely not something we want to have to resort to.
Command & Conquer
Using the command pattern (even without MVVM command bindings), a CreateButton_Click handler would still be responsible for kicking the “create a new item” logic into action… but now that logic would be living in some ICommand implementation, encapsulating its dependencies and state (and thus moving these outside of the form’s code-behind but not into global scope now).
'@Folder MVVM.Infrastructure.Abstract
'@ModuleDescription "An object that represents an executable command."
'@Interface
'@Exposed
Option Explicit
'@Description "Returns True if the command is enabled given the provided binding context (ViewModel)."
Public Function CanExecute(ByVal Context As Object) As Boolean
End Function
'@Description "Executes the command given the provided binding context (ViewModel)."
Public Sub Execute(ByVal Context As Object)
End Sub
'@Description "Gets a user-friendly description of the command."
Public Property Get Description() As String
End Property
This makes a command an abstraction that has:
A user-friendly description of what the command does.
A function that takes a context object and returns a Boolean value that indicates whether the command can currently be executed.
An Execute procedure that takes a context object and, well, executes the command.
The mysterious Context parameter is an object that encapsulates the state, the data we’re working with. In MVVM that would be the ViewModel instance.
MVVM command bindings use the Description property to set the ControlToolTip string of a binding’s target CommandButton object, and automatically invoke the CanExecute method as property bindings update, which automatically enables or disables the bound command button control: the command pattern works very, very well with Model-View-ViewModel, but nothing says we cannot use the command pattern without it.
So let’s strip the interface of its Description property, leaving only the CanExecute and Execute methods:
'@Folder CommandPattern.Example
'@ModuleDescription "An object that represents an executable command."
'@Interface
'@Exposed
Option Explicit
'@Description "Returns True if the command is enabled given the provided context."
Public Function CanExecute(ByVal Context As Object) As Boolean
End Function
'@Description "Executes the command given the provided context."
Public Sub Execute(ByVal Context As Object)
End Sub
We’re still going to need a Click handler in the code-behind for each CommandButton on a form, but now that we have an ICommand abstraction to code against, we can already go back to the Divide & Conquer form’s code-behind and watch it melt:
Private CreateNewItem As ICommand
Private DeleteSelectedItems As ICommand
Private EditSelectedItem As ICommand
Public Property Get Model() As Object
'gets an object holding the data needed for this form
End Property
Private Sub CreateButton_Click()
CreateNewItem.Execute Me.Model
End Sub
Private Sub DeleteButton_Click()
DeleteSelectedItems.Execute Me.Model
End Sub
Private Sub EditButton_Click()
EditSelectedItem.Execute Me.Model
End Sub
That of course is again just simplified illustrative code, but the lower-abstraction implementation details that were omitted for brevity in the “divide & conquer” code no longer need to find a place to call home, and no longer even need to be omitted either: that lower-abstraction code is simply gone from the code-behind now, and lives in a handful of distinct objects that implement the ICommand interface, such that the only thing a button’s Click handler needs to do now is to invoke a high-abstraction method that does whatever it needs to do.
At a glance, such a one-liner CreateNewItem.Execute instruction looks very similar to another one-liner CreateNewItem instruction (both involve a procedure call against an object – but only one of them is a command); the difference is that now the form is [blissfully] unaware of how that activity is going to happen, and a maintainer looking for the code that creates a new item will find it in a CreateNewItemCommand class, instead of somewhere in the middle of other specialized procedure scopes all in the same module.
Embracing Changes
Code changes, code evolves, it’s inevitable: code lives. When we code against abstractions, we reduce the code’s resistance to change. You want your code to embrace changes, you want it to welcome changes and extensions.
By coding against an ICommand interface, the only thing we commit to is that clicking a button will do something; we don’t know what and we don’t even need to care, and that’s what not resisting change means: we aren’t saying “run procedure X in module Y” anymore, we’re saying “run X implemented by any class whatsoever“. The actual code that runs the command is bound at run-time and doesn’t even need to exist for the code to compile, and the form is still fully-functional given no-op stub “commands” – we just need to get more abstract about what “to be functional” means for a form (meaning, if we click a button and ICommand.Execute is invoked, then we’re good – that’s all we need the form to do here).
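Such a no-op stub could be as minimal as this sketch of a hypothetical NoOpCommand class:

'NoOpCommand class module (illustrative)
Option Explicit
Implements ICommand

Private Function ICommand_CanExecute(ByVal Context As Object) As Boolean
    ICommand_CanExecute = True
End Function

Private Sub ICommand_Execute(ByVal Context As Object)
    'deliberately does nothing
End Sub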
The hypothetical example code above implies a separate CreateItemCommand class; it might look something like this:
Option Explicit
Implements ICommand
Private Function ICommand_CanExecute(ByVal Context As Object) As Boolean
ICommand_CanExecute = True
End Function
Private Sub ICommand_Execute(ByVal Context As Object)
With New ItemEditorForm
.Show
If .Cancelled Then Exit Sub
AddToSource .Model, Context
End With
End Sub
Private Sub AddToSource(ByVal Thing As Something, ByVal Context As Object)
Context.AddThing Thing
End Sub
Note that this is again really just moving private methods from one place into their own class, so AddToSource would be the same code as before; only now, the “source” collection that the item needs to be added to lives in the Context object. We’re accessing that object late-bound here for simplicity’s sake, but a command implementation that works with a particular, specific type of Context object should validate that, and cast the parameter into a local variable declared with the appropriate type, so as to avoid such unnecessary late binding, like this:
Private Sub DoSomething(ByVal Context As Object)
Debug.Assert TypeOf Context Is Class1
Dim LocalContext As Class1
Set LocalContext = Context '<~ type mismatch here if the assert fails
'carry on using LocalContext with early-bound member calls
End Sub
By moving the implementation out of the button’s Click handler, we make it much easier to later repurpose that button, or to make a future button elsewhere that invokes the same command. The form module doesn’t need to know about any concrete implementation of the ICommand interface: a button can be wired-up to any command, swapping SomeCommand for a SomeOtherCommand implementation is all that’s needed.
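The wiring itself can be a single Set assignment, for example in the form’s Initialize handler (a sketch, reusing the CreateItemCommand class implied above):

Private Sub UserForm_Initialize()
    Set CreateNewItem = New CreateItemCommand
    'swapping implementations changes nothing else in this module:
    'Set CreateNewItem = New SomeOtherCommand
End Sub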
One Step Further
We’ve seen how to pull functionality from a form’s code-behind and refactor it into specialized command objects that can be invoked from a button’s Click handler. The nicest thing about such commands, is that they are full-fledged objects, which means they can be passed around as parameters – and Model-View-ViewModel (MVVM) leverages that.
In the MVVM object model, you have a top-level AppContext object that exposes an ICommandManager object: this manager is responsible for holding a reference to all command bindings in your MVVM application, and there’s an IBindingManager that notifies it whenever a property binding updates in a way that may require commands’ CanExecute method to be evaluated.
When coding against the MVVM object model, you no longer wire up event handlers: the MVVM infrastructure automatically does it for you. The only code that remains (that actually does anything) in a form’s code-behind is the code that wires up form controls to property and command bindings; the rest is just implementations for the IView and ICancellable interfaces (as applicable), and then a factory method can initialize a bunch of properties (or the properties can be Set from outside the module, but a Create factory method works very well with UserForm classes for property injection):
Option Explicit
Implements IView
Implements ICancellable
Private Type TState
Context As MVVM.IAppContext
ViewModel As ExampleViewModel '<~ any class implementing INotifyPropertyChanged
IsCancelled As Boolean
CreateNewItem As ICommand
DeleteSelectedItems As ICommand
EditSelectedItem As ICommand
End Type
Private This As TState
'...properties...
Public Property Get ViewModel() As ExampleViewModel
Set ViewModel = This.ViewModel
End Property
Private Sub InitializeView()
With This.Context.Commands
.BindCommand ViewModel, Me.CreateButton, ViewModel.CreateNewItem
.BindCommand ViewModel, Me.DeleteButton, ViewModel.DeleteSelectedItems
.BindCommand ViewModel, Me.EditButton, ViewModel.EditSelectedItem
.BindCommand ViewModel, Me.CancelButton, CancelCommand.Create(Me)
End With
End Sub
'...interface implementations...
The UI controls are still referred to as Me.CreateButton, Me.DeleteButton, and Me.EditButton (plus Me.CancelButton for good measure), but now, instead of handling their Click events, we bind them to ICommand objects, whose references we conveniently expose as Property Get members of our ViewModel; we can also bind a command that we create inline, like this CancelCommand instance. Shame the QueryClose event isn’t exposed, because then binding a CancelCommand to a UserForm would be all you’d need to do for it to automagically, properly close/cancel a dialog.
Note that the form doesn’t even need to know what specific ICommand implementations it’s given to work with, at all: here the form is coupled with the CancelCommand, but all other commands (create, delete, edit) are binding to public ICommand properties that live on the ViewModel object.
Full Circle: EventCommand (MVVM)
Not all commands are created equal: a command like CancelCommand is generic enough that it can work with any ICancellable object, and an AcceptCommand can work with any implementation of the IView interface. On the other hand, something feels wrong about systematically implementing any & all commands in their own classes.
Having each command neatly factored into its own class module is a great way to implement complex commands, but can be overkill when things are relatively trivial – very often the ViewModel class already has access to every object a command needs, and having a way to make the ViewModel itself implement the command would solve this.
I’m going to introduce an EventCommand class into the MVVM infrastructure code, to do exactly this:
'@Folder MVVM.Common.Commands
'@ModuleDescription "A command that allows the ViewModel to supply the implementation."
'@PredeclaredId
'@Exposed
Option Explicit
Implements ICommand
Private Type TState
Description As String
End Type
Private This As TState
Public Event OnCanExecute(ByVal Context As Object, ByRef outResult As Boolean)
Public Event OnExecute(ByVal Context As Object)
'@Description "Creates a new instance of this ICommand class. Set the returned reference to a WithEvents variable."
Public Function Create(ByVal Description As String) As ICommand
Dim Result As EventCommand
Set Result = New EventCommand
Result.Description = Description
Set Create = Result
End Function
'@Description "Gets/sets the command's Description."
Public Property Get Description() As String
Description = This.Description
End Property
Friend Property Let Description(ByVal RHS As String)
This.Description = RHS
End Property
Private Function ICommand_CanExecute(ByVal Context As Object) As Boolean
Dim outResult As Boolean
outResult = True
RaiseEvent OnCanExecute(Context, outResult)
ICommand_CanExecute = outResult
End Function
Private Property Get ICommand_Description() As String
ICommand_Description = This.Description
End Property
Private Sub ICommand_Execute(ByVal Context As Object)
RaiseEvent OnExecute(Context)
End Sub
In VBA we can’t pass functions around like we can with delegates in C#, but events are a nice language feature we can still leverage for this purpose. Code like this could be in any ViewModel class:
Private WithEvents PseudoDelegateCommand As EventCommand
'...
Private Sub Class_Initialize()
Set PseudoDelegateCommand = EventCommand.Create("Full circle!")
End Sub
'...
Private Sub PseudoDelegateCommand_OnCanExecute(ByVal Context As Object, outResult As Boolean)
'supply the ICommand.CanExecute implementation here.
'assign outResult to False to disable the command (it's True by default).
'in principle, the Context *is* the ViewModel instance, so this assertion should hold:
Debug.Assert Me Is Context
'it also means the Context parameter should probably be ignored.
End Sub
Private Sub PseudoDelegateCommand_OnExecute(ByVal Context As Object)
'supply the ICommand.Execute implementation here.
'in principle, the Context *is* the ViewModel instance, so this assertion should hold:
Debug.Assert Me Is Context
'it also means the Context parameter should probably be ignored.
'EventCommand is useful for commands that are specific to a particular ViewModel,
'and don't really need to have their implementation extracted into their own class.
End Sub
And now we’ve gone full circle and essentially moved the Click handlers out of the View …and into the ViewModel – except these aren’t Click handlers now, although they will run when a user clicks the associated button (mind-boggling, right?): we’re essentially looking at callbacks here, invoked from within the MVVM infrastructure in response to control events… and/or INotifyPropertyChanged notifications from the ViewModel.
From a testability standpoint, it’s important to understand the implications: if you intend to have your ViewModel under a thorough suite of unit tests, then an EventCommand becomes somewhat of a liability. The OnExecute handler (or OnCanExecute, for that matter) shouldn’t require dependencies that the ViewModel doesn’t already have, so that tests can property-inject stub dependencies. In other words, unless the ViewModel already depends on an abstraction to access, say, a database connection or the file system, then the handlers of an EventCommand in that class shouldn’t connect to a database or access the file system.
You’re in command
Whether it’s for a workbook with many simple (-ish) macros, or for a full-fledged MVP, MVC, or MVVM application, implementing the command pattern lets you move the code that contains your actual functionality wherever it makes the most sense to have it. Unless you’re writing a Smart UI, that place is pretty much never the code-behind of the View module. By implementing an ICommand interface directly, you can move all that code from the UI to a command class whose sole purpose is to provide that particular piece of functionality.
Using an EventCommand with MVVM, you can even move that code from the UI to literally anywhere you want, as long as that is a class module (only class modules can have a WithEvents instance variable). It’s not uncommon to see a ViewModel class include somewhat high-abstraction code that provides commands’ implementations.
See and follow github.com/rubberduck-vba/MVVM for the Model-View-ViewModel infrastructure code that makes command bindings a thing in VBA, as well as examples (including a Smart UI!) and additional documentation.
I have recently written a (100% VBA) proof-of-concept for a Model-View-ViewModel (MVVM) framework, and since the prototype works exactly as needed (with some rough edges, of course)… I’ve decided to explore what Rubberduck can do to make MVVM fully supported. Going down that path poses a serious problem that needs a very good, well-thought-out solution, though.
A Vision of a Framework
When you start a new project in Visual Studio (including 6.0/VB6), the IDE prompts for a project type, essentially asking “what are we building today?”
In VBA the assumption is that you just want to write a bit of script to automate some document manipulation. And then the framework, so to speak, is the VBA Standard Library: functions, methods, constants, and actual objects too, all globally scoped for convenience and quick-and-easy access: a fully spelled-out VBA.Interaction.MsgBox function call is a rare sight! Combined with the nonexistence of namespaces, this means the global scope is easily polluted, and name collisions are inevitable, since anything exposed by any library becomes globally accessible. This makes fully-qualified global function calls appear sporadically sprinkled in the code, which can be confusing. I digress, but what I mean to get at is that this is part of what made Microsoft shift to the .NET platform in the early 2000s, and eventually abandon the Visual Basic Editor to its fate: the COM platform and Win32 API was the framework, and Win32 programming languages were built on top of that.
This leaves two approaches for a vision of a “framework” for VBA:
Package a type library and ship it.
Pros: any COM-visible library will work, can be written in .NET
Cons: projects now have a hard dependency on a specific type library; updating is a mess, etc.
Embed the framework into VBA projects, pretty much like JavaScript does.
Pros: devs are in charge of everything, framework is 100% VBA and inherently open-source, updating is essentially seamless for any non-breaking change, no early-bound dependencies, graceful late-bound degradation, etc.
Cons: VBA devs and maintainers that aren’t using Rubberduck will be massively lost in the source code (the framework would cleanly leverage @Folder annotations); when the host application allows it, this could be mitigated by embedding the code into its own separate VBA project and referencing it from other projects (e.g. ship an Excel add-in with the framework code your VBA project depends on).
I’m slightly biased here, but I think this rules out the type library approach regardless. So we need a way to make this work in VBA, with VBA source code that lives in a GitHub repository with vetted, trusted content.
Where Rubberduck fits in
Like Visual Studio, Rubberduck could prompt VBA devs with “what are we building today?” and offer to pull various “bundles” of modules from this GitHub repository into the active project. Rubberduck would request the available bundles from api.rubberduckvba.com, which would respond with metadata describing each package (is “nugget” forbidden as a name for these, as a play on “nuget”, the package manager for .NET?), and then list them in a nice little dialog.
The “nugget” metadata would include a name, a description, and the path to each file to download for it. Every package would be the same “version”, but the tool could easily request any particular “tag” or “release” version, and/or pull from “main” or from “next” branches, and the source code / framework itself could then easily be a collaborative effort, with its own features and projects and milestones and collaborators, completely separate from the C# Rubberduck code base.
This complete decoupling from Rubberduck means you don’t need to use Rubberduck to leverage this VBA code in your VBA projects, and new tags/“releases” would be entirely independent of Rubberduck’s own release cycles. Say you’re using future-Rubberduck 2.7.4 and the “nuggets” feature offers “v1.0 [main]” and “v1.1 [next]”; one day you’re still using Rubberduck 2.7.4 but now you get “v1.1 [main]”, “v1.0”, and “v1.2 [next]” to choose from. And if you updated the “nuggets” in your project from v1.0 to v1.1, then Rubberduck inspections would flag uses of any obsolete members that would now be decorated with @Obsolete annotations… it’s almost like this annotation was presciently made for this.
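An obsolete member would then simply be annotated, perhaps with a justification string (a hypothetical sketch):

'@Obsolete "Use the new SomethingBetter method instead."
Public Sub Something()
    '...
End Sub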
But before we can even think of implementing something like this and make MVVM infrastructure the very first “nugget”, we need a rock-solid framework in the first place.
Unit Tests
I had already written the prototype in a highly decoupled manner, mindful of dependencies and how things could later be tested from the outside. I’m very much not-a-zealot when it comes to things like Test-Driven Development (TDD), but I do firmly believe unit tests provide a solid safety net and documentation for everything that matters – especially if the project is to make any kind of framework, where things need to provably work.
And then it makes a wonderful opportunity to blog about writing unit tests with Rubberduck, something I really haven’t written nearly enough about.
Tests? Why?!
Just by writing these tests, I’ve found and fixed edge-case bugs and improved decoupling and cohesion by extracting (and naming!) smaller chunks of functionality into their own separate class module. The result is quite objectively better, simpler code.
Last but not least, writing testable code (let alone the tests!) in VBA makes a great way to learn these more advanced notions and concepts in a language you’re already familiar with.
If you’re new to VBA and programming in general, or if you’re not a programmer and you’re only interested in making macros, then reading any further may make your head spin a bit (if that’s already under way… I’m sorry!), so don’t hesitate to ask here or on the examples repository on GitHub if you have any questions! This article is covering a rather advanced topic, beyond classes and interfaces, but keep in mind that unit testing does not require OOP! It just so happens that object-oriented code adhering to SOLID principles tends to be easily testable.
This is an ongoing project and I’m still working on the test suite and refactoring things; I wouldn’t want to upload the code to GitHub in its current shape, so I’ll come back here with a link once I have something that’s relatively complete.
Where to Start?
There’s a relatively small but very critical piece of functionality that makes a good place to begin in the MVVM infrastructure code (see previous article): the BindingPath class, which I’ve pulled out of PropertyBinding this week. The (still too large for its own good) PropertyBinding class is no longer concerned with the intricacies of resolving property names and values: both This.Source and This.Target are declared As IBindingPath in a PropertyBinding now, which feels exactly right.
The purpose of a BindingPath is to take a “binding context” object and a “binding path” string (the binding path is always relative to the binding context), and to resolve the member call represented there. For example, this would be a valid use of the class:
Dim Path As IBindingPath
Set Path = BindingPath.Create(Sheet1.Shapes("Shape1").TextFrame.Characters, "Text")
This Path object implements TryReadPropertyValue and TryWritePropertyValue methods that the BindingManager can invoke as needed.
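Usage might look like this (a sketch, reusing the Path object created above):

Dim Value As Variant
If Path.TryReadPropertyValue(Value) Then
    Debug.Print Value '<~ the shape's text
End If

If Not Path.TryWritePropertyValue("Hello") Then
    Debug.Print "could not write to " & Path.Path
End If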
'@Folder MVVM.Infrastructure.Bindings
'@ModuleDescription "An object that can resolve a string property path to a value."
'@PredeclaredId
Option Explicit
Implements IBindingPath
Private Type TState
Context As Object
Path As String
Object As Object
PropertyName As String
End Type
Private This As TState
'@Description "Creates a new binding path from the specified property path string and binding context."
Public Function Create(ByVal Context As Object, ByVal Path As String) As IBindingPath
GuardClauses.GuardNonDefaultInstance Me, BindingPath, TypeName(Me)
GuardClauses.GuardNullReference Context, TypeName(Me)
GuardClauses.GuardEmptyString Path, TypeName(Me)
Dim Result As BindingPath
Set Result = New BindingPath
Set Result.Context = Context
Result.Path = Path
Result.Resolve
Set Create = Result
End Function
'@Description "Gets/Sets the binding context."
Public Property Get Context() As Object
Set Context = This.Context
End Property
Public Property Set Context(ByVal RHS As Object)
GuardClauses.GuardDefaultInstance Me, BindingPath, TypeName(Me)
GuardClauses.GuardNullReference RHS, TypeName(Me)
GuardClauses.GuardDoubleInitialization This.Context, TypeName(Me)
Set This.Context = RHS
End Property
'@Description "Gets/Sets a string representing a property path against the binding context."
Public Property Get Path() As String
Path = This.Path
End Property
Public Property Let Path(ByVal RHS As String)
GuardClauses.GuardDefaultInstance Me, BindingPath, TypeName(Me)
GuardClauses.GuardEmptyString RHS, TypeName(Me)
GuardClauses.GuardDoubleInitialization This.Path, TypeName(Me)
This.Path = RHS
End Property
'@Description "Gets the bound object reference."
Public Property Get Object() As Object
Set Object = This.Object
End Property
'@Description "Gets the name of the bound property."
Public Property Get PropertyName() As String
PropertyName = This.PropertyName
End Property
'@Description "Resolves the Path to a bound object and property."
Public Sub Resolve()
This.PropertyName = ResolvePropertyName(This.Path)
Set This.Object = ResolvePropertyPath(This.Context, This.Path)
End Sub
Private Function ResolvePropertyName(ByVal PropertyPath As String) As String
Dim Parts As Variant
Parts = Strings.Split(PropertyPath, ".")
ResolvePropertyName = Parts(UBound(Parts))
End Function
Private Function ResolvePropertyPath(ByVal Context As Object, ByVal PropertyPath As String) As Object
Dim Parts As Variant
Parts = Strings.Split(PropertyPath, ".")
If UBound(Parts) = LBound(Parts) Then
Set ResolvePropertyPath = Context
Else
Dim RecursiveProperty As Object
Set RecursiveProperty = CallByName(Context, Parts(0), VbGet)
If RecursiveProperty Is Nothing Then Exit Function
Set ResolvePropertyPath = ResolvePropertyPath(RecursiveProperty, Right$(PropertyPath, Len(PropertyPath) - Len(Parts(0)) - 1))
End If
End Function
Private Property Get IBindingPath_Context() As Object
Set IBindingPath_Context = This.Context
End Property
Private Property Get IBindingPath_Path() As String
IBindingPath_Path = This.Path
End Property
Private Property Get IBindingPath_Object() As Object
Set IBindingPath_Object = This.Object
End Property
Private Property Get IBindingPath_PropertyName() As String
IBindingPath_PropertyName = This.PropertyName
End Property
Private Sub IBindingPath_Resolve()
Resolve
End Sub
Private Function IBindingPath_ToString() As String
IBindingPath_ToString = StringBuilder _
.AppendFormat("Context: {0}; Path: {1}", TypeName(This.Context), This.Path) _
.ToString
End Function
Private Function IBindingPath_TryReadPropertyValue(ByRef outValue As Variant) As Boolean
If This.Object Is Nothing Then Resolve
On Error Resume Next
outValue = VBA.Interaction.CallByName(This.Object, This.PropertyName, VbGet)
IBindingPath_TryReadPropertyValue = (Err.Number = 0)
On Error GoTo 0
End Function
Private Function IBindingPath_TryWritePropertyValue(ByVal Value As Variant) As Boolean
If This.Object Is Nothing Then Resolve
On Error Resume Next
VBA.Interaction.CallByName This.Object, This.PropertyName, VbLet, Value
IBindingPath_TryWritePropertyValue = (Err.Number = 0)
On Error GoTo 0
End Function
Here’s our complete “system under test” (SUT) as far as the BindingPathTests module goes. We have a Create factory method, Context and Path properties, just like the class we’re testing.
The path object is itself read-only once initialized, but the binding source may resolve to Nothing or to a different object reference over the course of the object’s lifetime: say we want a binding path to SomeViewModel.SomeObjectProperty; when we first create the binding, SomeObjectProperty might very well be Nothing, and then it’s later Set-assigned to a valid object reference. This is why the IBindingPath interface needs to expose a Resolve method, so that IPropertyBinding can invoke it as needed, as the binding is being applied.
We’ll want a test for every guard clause, and each method needs at least one test as well.
So, I’m going to add a new test module and call it BindingPathTests. Rubberduck’s templates are good enough to depict the mechanics and how things work at a high level, but if you stick to the templates you’ll quickly find your unit tests rather boring, wordy, and repetitive: we must break out of the mold; there isn’t one true way to do this!
Rubberduck discovers unit tests in standard modules annotated with @TestModule. Test methods are any [parameterless, for now] methods annotated with a @TestMethod annotation, which can take a category string – the Test Explorer can group your tests using these categories. The declarations section of a test module must include a declaration (early or late bound) for a Rubberduck.AssertClass or Rubberduck.PermissiveAssertClass: both implement the same internal interface; the “permissive” one has VBA-like equality semantics, while the default one has stricter type equality requirements (a Long can’t be equal to a Double, for example). The default test template also defines a FakesProvider object, but we’re not going to need it now (if we needed to test logic that involved e.g. branching on the result of a MsgBox function call, we could hook into the MsgBox function and configure it to return whatever the test needs it to return, which is honestly wicked awesome). So our test module might look something like this at first:
'@Folder Tests.Bindings
'@TestModule
Option Explicit
Option Private Module
#Const LateBind = LateBindTests
#If LateBind Then
Private Assert As Object
#Else
Private Assert As Rubberduck.PermissiveAssertClass
#End If
With this conditionally-compiled setup, all we need to toggle between late and early binding is a project-scoped conditional compilation argument: bring up the project properties and type LateBindTests=0 or LateBindTests=1 in the Conditional Compilation Arguments box, and just like that you can control conditional compilation project-wide without modifying a single module.
The first thing to do is to define the test state, and implement TestInitialize and TestCleanup methods that configure this state. In the case of the BindingManagerTests module (a more involved example than BindingPathTests), the private type and private field that define and hold the current test state look like this:
Private Type TState
ExpectedErrNumber As Long
ExpectedErrSource As String
ExpectedErrorCaught As Boolean
ConcreteSUT As BindingManager
AbstractSUT As IBindingManager
HandlePropertyChangedSUT As IHandlePropertyChanged
BindingSource As TestBindingObject
BindingTarget As TestBindingObject
SourcePropertyPath As String
TargetPropertyPath As String
Command As TestCommand
End Type
Private Test As TState
Unit Testing Paradigm
Test modules are special, in the sense that they aren’t (absolutely shouldn’t be anyway) accessible from any code path in the project. Rubberduck invokes them one by one when you run a command like “run all tests” or “repeat last run”. But there’s a little more to it than that, worthy of mention.
VBA being single-threaded, Rubberduck invokes tests on the UI/main thread, using a bit of trickery to keep its own UI somewhat responsive. Each module runs sequentially, and each test inside each module runs sequentially as well – but the test execution order still shouldn’t be considered deterministic, and each test should be completely independent of every other test, such that executing all tests in any given order always produces the same outcomes.
A test that makes no assertions will be green/successful. When writing unit tests, the first thing you want to see is a test that’s failing (you can’t trust a test you have never seen fail!), and with Rubberduck in order to give a test a reason to fail, you use Assert methods (wiki).
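The simplest possible failing test needs nothing more than this:

'@TestMethod("Example")
Private Sub ThisTestFails()
    Assert.Fail "There is no implementation yet!"
End Sub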
When Rubberduck begins processing a test module, it invokes the methods (again sequentially, but not in an order that should matter) annotated with @ModuleInitialize; ideally that would be only one method.
This is where the Assert object should be assigned (the default test templates do this):
'@ModuleInitialize
Private Sub ModuleInitialize()
#If LateBind Then
'requires HKCU registration of the Rubberduck COM library.
Set Assert = CreateObject("Rubberduck.PermissiveAssertClass")
#Else
'requires project reference to the Rubberduck COM library.
Set Assert = New Rubberduck.PermissiveAssertClass
#End If
End Sub
Rubberduck’s test engine will then execute all methods (usually cleaner with only one, though) annotated with @TestInitialize before executing each test in the module; that is the best place to put the wordy setup code that would otherwise need to be repeated in pretty much every single test of the module:
'@TestInitialize
Private Sub TestInitialize()
Dim Context As TestBindingObject
Set Context = New TestBindingObject
Set Context.TestBindingObjectProperty = New TestBindingObject
Test.Path = "TestBindingObjectProperty.TestStringProperty"
Test.PropertyName = "TestStringProperty"
Set Test.BindingSource = Context.TestBindingObjectProperty
Set Test.BindingContext = Context
Set Test.ConcreteSUT = BindingPath.Create(Test.BindingContext, Test.Path)
Set Test.AbstractSUT = Test.ConcreteSUT
End Sub
By moving the test state to module level rather than having it local to each test, we already eliminate a lot of code duplication, and the Test module variable makes for a rather nifty way to access the current test state, too!
Methods annotated with @TestCleanup are automatically invoked after each test in the module; in order to avoid accidentally sharing state between tests, every object reference should be explicitly set to Nothing, and values of intrinsic data types should be explicitly reset to their respective default value:
'@TestCleanup
Private Sub TestCleanup()
Set Test.ConcreteSUT = Nothing
Set Test.AbstractSUT = Nothing
Set Test.BindingSource = Nothing
Set Test.BindingContext = Nothing
Test.Path = vbNullString
Test.PropertyName = vbNullString
Test.ExpectedErrNumber = 0
Test.ExpectedErrSource = vbNullString
Test.ExpectedErrorCaught = False
End Sub
What Goes Into the Test State?
A number of members should always be in the Test state structure:
ConcreteSUT (or just SUT) and AbstractSUT both point to the same object, through the default interface (BindingPath) and the explicit one (IBindingPath), respectively.
If the system under test class implements additional interfaces, having a pointer to the SUT object with these interfaces is also useful. For example the TState type for the BindingManager class has a HandlePropertyChangedSUT As IHandlePropertyChanged member, because the class implements this interface.
Default property values and dependency setup: we want a basic default SUT configured and ready to be tested (or fine-tuned and then tested).
ExpectedErrNumber, ExpectedErrSource, and ExpectedErrorCaught are useful when a test expects a given input to produce a specific error.
Expecting Errors
The “expected error” test method template works for its purpose, but having this on-error-assert logic duplicated everywhere is rather ugly. Consider pulling that logic into a private method instead (I’m considering adding this to Rubberduck’s test module templates):
Private Sub ExpectError()
Dim Message As String
If Err.Number = Test.ExpectedErrNumber Then
If (Test.ExpectedErrSource = vbNullString) Or (Err.Source = Test.ExpectedErrSource) Then
Test.ExpectedErrorCaught = True
Else
Message = "An error was raised, but not from the expected source. " & _
"Expected: '" & TypeName(Test.ConcreteSUT) & "'; Actual: '" & Err.Source & "'."
End If
ElseIf Err.Number <> 0 Then
Message = "An error was raised, but not with the expected number. Expected: '" & Test.ExpectedErrNumber & "'; Actual: '" & Err.Number & "'."
Else
Message = "No error was raised."
End If
If Not Test.ExpectedErrorCaught Then Assert.Fail Message
End Sub
With this infrastructure in place, the unit tests for all guard clauses in the module can look like this – it’s still effectively doing Arrange-Act-Assert like the test method templates strongly suggest, only implicitly so (each “A” is essentially its own statement, see comments in the tests below):
'@TestMethod("GuardClauses")
Private Sub Create_GuardsNullBindingContext()
Test.ExpectedErrNumber = GuardClauseErrors.ObjectCannotBeNothing '<~ Arrange
On Error Resume Next
BindingPath.Create Nothing, Test.Path '<~ Act
ExpectError '<~ Assert
On Error GoTo 0
End Sub
'@TestMethod("GuardClauses")
Private Sub Create_GuardsEmptyPath()
Test.ExpectedErrNumber = GuardClauseErrors.StringCannotBeEmpty '<~ Arrange
On Error Resume Next
BindingPath.Create Test.BindingContext, vbNullString '<~ Act
ExpectError '<~ Assert
On Error GoTo 0
End Sub
'@TestMethod("GuardClauses")
Private Sub Create_GuardsNonDefaultInstance()
Test.ExpectedErrNumber = GuardClauseErrors.InvalidFromNonDefaultInstance '<~ Arrange
On Error Resume Next
With New BindingPath
.Create Test.BindingContext, Test.Path '<~ Act
ExpectError '<~ Assert
End With
On Error GoTo 0
End Sub
And then similar tests exist for the respective guard clauses of the Context and Path members. Having tests that validate that guard clauses are doing their job is great: it tells us exactly how not to use the class… but that doesn’t tell us much about what a BindingPath object actually does.
Testing the Actual Functionality
The methods we’re testing need to be written in a way that makes it possible for a test to determine whether they’re doing their job correctly or not. For functions and properties, the return value is the perfect thing to Assert on. For Sub procedures, you have to Assert on the side-effects, and you need reliable, verifiable ways to observe them.
These two tests validate that the BindingPath returned by the Create factory method has resolved the PropertyName and Object properties, respectively.
'@TestMethod("Bindings")
Private Sub Create_ResolvesPropertyName()
Dim SUT As BindingPath
Set SUT = BindingPath.Create(Test.BindingContext, Test.Path)
Assert.IsFalse SUT.PropertyName = vbNullString
End Sub
'@TestMethod("Bindings")
Private Sub Create_ResolvesBindingSource()
Dim SUT As BindingPath
Set SUT = BindingPath.Create(Test.BindingContext, Test.Path)
Assert.IsNotNothing SUT.Object
End Sub
I could have made multiple assertions in a test, like this…
'@TestMethod("Bindings")
Private Sub Create_ResolvesBindingSource()
Dim SUT As BindingPath
Set SUT = BindingPath.Create(Test.BindingContext, Test.Path)
Assert.IsFalse SUT.PropertyName = vbNullString
Assert.IsNotNothing SUT.Object
End Sub
The Test Explorer would say “IsFalse assert failed” or “IsNotNothing assert failed”, so it’s arguably (perhaps pragmatically so) still useful and clear enough why that test would fail – and if you had multiple Assert.IsFalse calls in a test, you could provide a different message for each… but as a rule of thumb, a test wants to have exactly one reason to fail. If the conditions to meaningfully pass or fail a test aren’t present, use Assert.Inconclusive to report the test as such:
'@TestMethod("Bindings")
Private Sub Resolve_SetsBindingSource()
With New BindingPath
.Path = Test.Path
Set .Context = Test.BindingContext
If Not .Object Is Nothing Then Assert.Inconclusive "Object reference is unexpectedly set."
.Resolve
Assert.AreSame Test.BindingSource, .Object
End With
End Sub
'@TestMethod("Bindings")
Private Sub Resolve_SetsBindingPropertyName()
With New BindingPath
.Path = Test.Path
Set .Context = Test.BindingContext
If .PropertyName <> vbNullString Then Assert.Inconclusive "PropertyName is unexpectedly non-empty."
.Resolve
Assert.AreEqual Test.PropertyName, .PropertyName
End With
End Sub
This mechanism is especially useful when the test state isn’t in local scope and there’s a real possibility that the TestInitialize method is eventually modified and inadvertently breaks a test. Such conditional Assert.Inconclusive calls are definitely a form of defensive programming, just like having guard clauses throwing custom meaningful errors.
Note that while we know that the BindingPath.Create function invokes the Resolve method, the tests for Resolve don’t involve Create: the Path and Context are being explicitly spelled out, and the .Resolve method is invoked from a New instance.
And that’s pretty much everything there is to test in the BindingPath class.
There’s one thing I haven’t mentioned yet, that you might have caught in the TState type:
BindingSource As TestBindingObject
BindingTarget As TestBindingObject
This TestBindingObject is a test stub: it’s a dependency of the class (it’s the “binding context” of the test path) and it’s a real object, but it is implemented in a bit of a special way that the BindingPath tests don’t do justice to.
Test Stubs
Eventually Rubberduck’s unit testing framework will feature a COM-visible wrapper around Moq, a popular mocking framework for .NET that Rubberduck already uses for its own unit test requirements. When this happens, Rubberduck unit tests will no longer need such “test stubs”: the framework will generate them at run-time and make them work exactly as specified/configured by a unit test, and “just like that” VBA/VB6 suddenly lands surprisingly close to on par with professional, current-day IDE tooling.
The ITestStub interface simply formalizes the concept:
'@Exposed
'@Folder Tests.Stubs
'@ModuleDescription "An object that stubs an interface for testing purposes."
'@Interface
Option Explicit
'@Description "Gets the number of times the specified member was invoked in the lifetime of the object."
Public Property Get MemberInvokes(ByVal MemberName As String) As Long
End Property
'@Description "Gets a string representation of the object's internal state, for debugging purposes (not intended for asserts!)."
Public Function ToString() As String
End Function
A TestStubBase “base class” provides the common implementation mechanics that every class implementing ITestStub will want to use – the idea is to use a keyed data structure to track the number of times each member is invoked during the lifetime of the object:
'@Folder Tests.Stubs
Option Explicit
Private Type TState
MemberInvokes As Dictionary
End Type
Private This As TState
'@Description "Tracks a new invoke of the specified member."
Public Sub OnInvoke(ByVal MemberName As String)
Dim newValue As Long
If This.MemberInvokes.Exists(MemberName) Then
newValue = This.MemberInvokes.Item(MemberName) + 1
This.MemberInvokes.Remove MemberName
Else
newValue = 1
End If
This.MemberInvokes.Add MemberName, newValue
End Sub
'@Description "Gets the number of invokes made against the specified member in the lifetime of this object."
Public Property Get MemberInvokes(ByVal MemberName As String) As Long
If This.MemberInvokes.Exists(MemberName) Then
MemberInvokes = This.MemberInvokes.Item(MemberName)
Else
MemberInvokes = 0
End If
End Property
'@Description "Gets a string listing the MemberInvokes cache content."
Public Function ToString() As String
Dim MemberNames As Variant
MemberNames = This.MemberInvokes.Keys
With New StringBuilder
Dim i As Long
For i = LBound(MemberNames) To UBound(MemberNames)
Dim Name As String
Name = MemberNames(i)
.AppendFormat "{0} was invoked {1} time(s)", Name, This.MemberInvokes.Item(Name)
Next
ToString = .ToString
End With
End Function
Private Sub Class_Initialize()
Set This.MemberInvokes = New Dictionary
End Sub
With this small bit of infrastructure, the TestBindingObject class is a full-fledged mock object that can increment a counter whenever a member is invoked, and that can be injected as a dependency for anything that needs an IViewModel:
'@Folder Tests.Stubs
'@ModuleDescription "An object that can stub a binding source or target for unit tests."
Option Explicit
Implements ITestStub
Implements IViewModel
Implements INotifyPropertyChanged
Private Type TState
Stub As TestStubBase
Handlers As Collection
TestStringProperty As String
TestNumericProperty As Long
TestBindingObjectProperty As TestBindingObject
Validation As IHandleValidationError
End Type
Private This As TState
Public Property Get TestStringProperty() As String
This.Stub.OnInvoke "TestStringProperty.Get"
TestStringProperty = This.TestStringProperty
End Property
Public Property Let TestStringProperty(ByVal RHS As String)
This.Stub.OnInvoke "TestStringProperty.Let"
If This.TestStringProperty <> RHS Then
This.TestStringProperty = RHS
OnPropertyChanged Me, "TestStringProperty"
End If
End Property
Public Property Get TestNumericProperty() As Long
This.Stub.OnInvoke "TestNumericProperty.Get"
TestNumericProperty = This.TestNumericProperty
End Property
Public Property Let TestNumericProperty(ByVal RHS As Long)
This.Stub.OnInvoke "TestNumericProperty.Let"
If This.TestNumericProperty <> RHS Then
This.TestNumericProperty = RHS
OnPropertyChanged Me, "TestNumericProperty"
End If
End Property
Public Property Get TestBindingObjectProperty() As TestBindingObject
This.Stub.OnInvoke "TestBindingObjectProperty.Get"
Set TestBindingObjectProperty = This.TestBindingObjectProperty
End Property
Public Property Set TestBindingObjectProperty(ByVal RHS As TestBindingObject)
This.Stub.OnInvoke "TestBindingObjectProperty.Set"
If Not This.TestBindingObjectProperty Is RHS Then
Set This.TestBindingObjectProperty = RHS
OnPropertyChanged Me, "TestBindingObjectProperty"
End If
End Property
Private Sub OnPropertyChanged(ByVal Source As Object, ByVal PropertyName As String)
Dim Handler As IHandlePropertyChanged
For Each Handler In This.Handlers
Handler.OnPropertyChanged Source, PropertyName
Next
End Sub
Private Sub Class_Initialize()
Set This.Stub = New TestStubBase
Set This.Handlers = New Collection
Set This.Validation = ValidationManager.Create
End Sub
Private Sub INotifyPropertyChanged_OnPropertyChanged(ByVal Source As Object, ByVal PropertyName As String)
OnPropertyChanged Source, PropertyName
End Sub
Private Sub INotifyPropertyChanged_RegisterHandler(ByVal Handler As IHandlePropertyChanged)
This.Handlers.Add Handler
End Sub
Private Property Get ITestStub_MemberInvokes(ByVal MemberName As String) As Long
ITestStub_MemberInvokes = This.Stub.MemberInvokes(MemberName)
End Property
Private Function ITestStub_ToString() As String
ITestStub_ToString = This.Stub.ToString
End Function
Private Property Get IViewModel_Validation() As IHandleValidationError
Set IViewModel_Validation = This.Validation
End Property
This functionality will be extremely useful when testing the actual property bindings: for example we can assert that a method was invoked exactly once, and fail a test if the method was invoked twice (and/or if it never was).
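For example, a test could look like this sketch – the Act step is hypothetical and depends on the binding implementation under test, but the MemberInvokes key matches what the stub’s property setter registers:
'@TestMethod("Bindings")
Private Sub Binding_InvokesTargetPropertyOnce()
    Dim Stub As ITestStub
    Set Stub = Test.BindingTarget
    '...Act: exercise the binding under test here...
    'Assert: the TestStringProperty setter ran exactly once
    Assert.IsTrue Stub.MemberInvokes("TestStringProperty.Let") = 1
End Sub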
There’s a lot more to discuss about unit testing in VBA with Rubberduck! I hope this article gives a good idea of how to get the best out of Rubberduck’s unit testing feature.
Unless you’re hosted in Access, your VBA project doesn’t have access to a database engine. If you’re in Excel, it’s easy to treat the host workbook as a database and each worksheet as a table. While we could build an application that uses Excel worksheets to store data, we probably shouldn’t. The reasons are many, but primarily (pun …yeah, intended), we want to be able to establish bullet-proof referential integrity between records/tables; while Excel is great for many things, it’s useless for that: that’s the job of a relational database management system (RDBMS), not that of any worksheet software, no matter how powerful. Power Query is very much worth looking into, but if you’re building a small CRUD (Create/Read/Update/Delete) application in VBA, you’ll want VBA code responsible for the data access – enter ADODB, …and every pitfall that comes with it.
Querying a database with ADODB is easy: just set up a connection, open it, then execute whatever SQL statement you need through the Connection, and you get the results in a Recordset object:
Dim conn As ADODB.Connection
Set conn = New ADODB.Connection
conn.Open "ConnectionString"
Dim rs As ADODB.Recordset
Set rs = conn.Execute("SELECT Field1, Field2 FROM Table1")
'...
rs.Close
conn.Close
That is great for one-time, ad-hoc queries: things quickly get messy when you start needing multiple queries, or when your SQL statement needs to be invoked repeatedly with different values:
Dim conn As ADODB.Connection
Set conn = New ADODB.Connection
conn.Open "ConnectionString"
Dim i As Long
For i = 1 To 10
Dim rs As ADODB.Recordset
Set rs = conn.Execute("SELECT Field1, Field2 FROM Table1 WHERE Field3 = " & i)
'...
rs.Close
Next
conn.Close
This right here – WHERE Field3 = " & i – is making the database engine work harder than it needs to… and it’s costing server-side performance, because as far as the engine knows, it’s getting a different query every time – and thus it computes the same execution plan over and over, every time… when it could just be reusing it. Databases are smart. Like, wicked smart… but yeah we still need to ask for the right thing!
Compare to something like this:
Const sql As String = "SELECT Field1, Field2 FROM Table1 WHERE Field3 = ?"
Dim conn As ADODB.Connection
Set conn = New ADODB.Connection
conn.Open "ConnectionString"
Dim i As Long
For i = 1 To 10
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
Set cmd.ActiveConnection = conn 'the command must be associated with the open connection
cmd.CommandType = adCmdText
cmd.CommandText = sql
cmd.Parameters.Append cmd.CreateParameter(Type:=adInteger, Value:=i)
Dim rs As ADODB.Recordset
Set rs = cmd.Execute
'...
rs.Close
Next
conn.Close
Oh my, so much more code, so little gain – right?
Using ADODB.Command when queries involve a WHERE (and/or VALUES) clause and user-provided (directly or not) values is not only more efficient (the cached execution plan is reused, because the command string is identical every time), it’s also more secure. Concatenating user inputs into SQL command strings is a common rookie mistake, and it’s a practice that is way more widespread than it should be (regardless of the language, paradigm, or platform); it makes your code vulnerable to SQL injection attacks – something that may or may not be in your threat model, but that inevitably turns into… easily avoidable bugs: think of what might happen if a user entered O’Connor in that LastName field. If you’re thinking “oh that’s easy! I’ll just double-up single quotes, and fixed!”, then you’re playing a needlessly exhausting game of cat-and-mouse with the next thing that will break your clever escaping: the mouse wins.
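To make this concrete, here’s what the naive concatenation does to a perfectly legitimate input (the Customers table and LastName field are made up for illustration):
Dim LastName As String
LastName = "O'Connor"
Debug.Print "SELECT * FROM Customers WHERE LastName = '" & LastName & "'"
'prints: SELECT * FROM Customers WHERE LastName = 'O'Connor'
'the dangling quote is a syntax error at best – and an injection vector at worst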
Abstract thoughts
Much simpler to just use an ADODB.Command every time, and when you need it parameterized, to Append any number of ADODB.Parameter objects to its Parameters collection. Except, that makes a lot of code to write, every time.
What do we do when we see repetitive patterns in code? If you’re thinking “we put it in a function!” then you’re thinking abstraction and that’s exactly the right train of thought.
We’re just going to take this abstraction… and make it an object. Then think of what objects it needs in order to do its job, and abstract these objects behind interfaces too, and take these abstractions in as constructor parameters of our Create “static” factory method. Rinse & repeat until all dependencies are property-injected and all responsibilities are nicely encapsulated into their own classes. It was fun!
I wrote an original version of this functionality a little while ago – you can find that original version on Code Review, and see how different/similar it is to this simplified/improved version in our Examples repository on GitHub.
The original was just an ADODB wrapper class though: it couldn’t really be unit-tested, and it was annoying to maintain because it felt very repetitive. This version separates the type mappings from the parameter-providing logic, so that configuring these mappings is done through an object that’s solely responsible for them; it also separates the command from the connection, and abstracts away that connection enough to enable unit testing and cover quite a large part of the API – but most importantly, this version exposes adequate abstractions for the calling code to use and stub in its own unit tests.
VBA code written with this API (and the principles it demonstrates) can easily be fully testable, without ever actually hitting any database.
I can do this in the immediate pane:
?UnitOfWork.FromConnectionString("connection string").Command.GetSingleValue("SELECT Field1 FROM Table1 WHERE Id=?", 1)
I mean, it’s a contrived example, but with a valid connection string, query, and arguments, that’s all we need to get an actual parameterized ADODB command sending that 1 as an actual ADODB parameter, …and the following debug output:
Begin connect...
Connect completed. Status: 1
Begin transaction completed.
Begin execute...
Execute completed, -1 record(s) affected.
{whatever value was in Field1}
Rollback transaction completed.
Disconnect completed. Status: 1
I made DbConnection listen in on whatever events the ADODB connection is firing, pending the implementation of an adapter to expose some IDbConnectionEvents members – the idea is to end up with client code that can inject its own callbacks and do things like log such messages. In the meantime, Debug.Print statements produce this debug output, but that’s an implementation detail: the class doesn’t publicly expose any of these events. It couldn’t, either: the rest of the code needs to work with the IDbConnection interface, and interfaces unfortunately can’t expose events in VBA.
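Listening in on those events only takes a WithEvents field inside the DbConnection class – a sketch showing the idea with the ExecuteComplete event (the handler signature is dictated by the ADODB library):
'inside DbConnection: the wrapped connection is declared WithEvents
Private WithEvents conn As ADODB.Connection

Private Sub conn_ExecuteComplete(ByVal RecordsAffected As Long, ByVal pError As ADODB.Error, adStatus As ADODB.EventStatusEnum, ByVal pCommand As ADODB.Command, ByVal pRecordset As ADODB.Recordset, ByVal pConnection As ADODB.Connection)
    'produces the "Execute completed" line in the debug output above
    Debug.Print "Execute completed, " & RecordsAffected & " record(s) affected."
End Sub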
SecureADODB
Some might call it layered spaghetti. Others call it lasagna. I call it well-abstracted code that reads and maintains like a charm and provably works as intended. There is nothing, absolutely nothing wrong with having many class modules in a VBA project: the only problem is… well, the VBE itself:
The Project Explorer makes OOP rather painful – in fact, it makes any kind of modularization painful.
Rubberduck’s Code Explorer makes the VBE more OOP-friendly: with it you can have folders regrouping modules by functionality rather than just by module type.
Nice, rich APIs involve many related objects, interfaces, methods – members that make up the object model the API’s client code will be working with. As long as we can keep all these classes organized, there’s no problem having many of them.
Before we look at the implementation, let’s review the interfaces and the overall structure.
Only two interfaces aren’t being stubbed for unit tests. One is IUnitOfWork: as the top-level object, nothing in the object model consumes it. It is needed though, because client code can inject it as a dependency of some FooRepository class, and then tests can provide that class with a StubUnitOfWork that implements IUnitOfWork.
The other “façade” interface is ITypeMap. This one isn’t really needed (neither is the predeclared instance of AdoTypeMappings, or its Default factory method), but something felt wrong with the client code without it. While the class is essentially just a dictionary – literally a map – there’s something rather elegant about depending on an ITypeMap rather than on some Scripting.Dictionary.
The two dark blue interfaces are abstract factory interfaces, each with a “real” and a “stub” implementation for tests: these are very simple classes whose entire purpose is to create an object of a particular type.
If we consider IParameterProvider an implementation detail of IDbCommandBase, that leaves us with only the core stuff: IDbCommandBase, IDbCommand, and IDbConnection – everything else just revolves around these.
DbCommandBase
The old SqlCommand code had two sets of commands: “Execute” for methods you could pass a Connection to, and “QuickExecute” for methods that created a connection on-the-spot. I decided to split the two behaviors into two distinct implementations of the same interface, and that’s how I ended up with DefaultDbCommand and AutoDbCommand. As I was cleaning up the two new classes, I noticed they needed a number of common bits of functionality… as would any other implementation of IDbCommand.
In a language that supports inheritance, I would probably make the two classes inherit a third abstract “base” class where I’d implement the IDbCommand interface. In VBA, we can’t derive a class from another, or inherit members from another class: inheritance is flat-out unavailable. There’s an alternative though, and it’s arguably even better than inheritance: composition. We can put the common functionality in a third class, and then have the two implementations take an instance of that “base” class as they would any other dependency – effectively achieving what we wanted out of inheritance, but through composition.
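In code, the composition looks something like this simplified sketch of an IDbCommand implementation (assuming the IDbCommandBase interface exposes the shared GetSingleValue member shown later):
'DefaultDbCommand, simplified: shared functionality comes in through composition
Implements IDbCommand
Private Type TState
    Base As IDbCommandBase 'the would-be "base class" is just another dependency
    Connection As IDbConnection
End Type
Private this As TState
Private Function IDbCommand_GetSingleValue(ByVal sql As String, ParamArray args() As Variant) As Variant
    Dim values() As Variant
    values = args 'copy the ParamArray into a regular array
    IDbCommand_GetSingleValue = this.Base.GetSingleValue(this.Connection, sql, values)
End Function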
Code is said to be “decoupled” when none of its concrete components are inter-dependent. Decoupled components can easily be swapped for other implementations, like… test stubs.
What’s wrong with inheritance?
Don’t get me wrong, inheritance is very cool: with an abstract class you can have templated methods, where a method in the base class (typically a method that implements some interface member) invokes an abstract or virtual method (typically with protected scope) that the inherited class must override and provide an implementation for. Rubberduck uses this pattern in quite a few places (inspections, notably). Without inheritance, it’s just not something that’s possible.
Inheritance is described as an “is a” relationship, while composition is more of a “has a” relationship. This is important, because when the only consideration weighing in favor of inheritance is the need for two classes to share some functionality, that is exactly when inheritance should not be used.
Decoupling FTW
The “base” class came from the need for a place where IDbCommand implementations could access shared functionality. I wanted to return disconnected recordsets, and retrieving the value of the first field of the first record of a recordset isn’t something that’s glaringly implementation-specific. The other piece of functionality I needed was a function that creates the ADODB.Command object and adds the parameters.
Because I wanted this class to create the ADODB.Command, I needed it to be able to turn a Variant into an ADODB.Parameter through some mapping, and since I didn’t want my class to be necessarily coupled with that mapping, or anything remotely related to configuring ADODB parameters… I’m property-injecting an IParameterProvider dependency:
Public Function Create(ByVal provider As IParameterProvider) As IDbCommandBase
Errors.GuardNonDefaultInstance Me, DbCommandBase
Errors.GuardNullReference provider
Dim result As DbCommandBase
Set result = New DbCommandBase
Set result.ParameterProvider = provider
Set Create = result
End Function
Validating the command string / arguments
Since the commands are given an SQL command string to execute, and a ParamArray array of arguments that should have the same number of items as there are ? ordinal parameters in the SQL command string, we have an opportunity to catch a missing or extraneous argument before we even send the command string to the database server. And because this validation logic would have to be the same regardless of what IDbCommand implementation we’re looking at, DbCommandBase makes the best place to put it.
This implementation is probably too naive for a number of edge cases (a ? inside a quoted string literal would throw the count off, for example), but it is sufficient for most: we’re simply counting the number of ? characters in the sql string, and comparing that with the number of elements in the args array. We need to handle errors here, because if the args array is empty, evaluating UBound(args) and/or LBound(args) will throw a “subscript out of range” run-time error 9.
Public Function ValidateOrdinalArguments(ByVal sql As String, ByRef args() As Variant) As Boolean
On Error GoTo CleanFail
Dim result As Boolean
Dim expected As Long
expected = Len(sql) - Len(Replace(sql, "?", vbNullString))
Dim actual As Long
On Error GoTo CleanFail 'if there are no args, LBound/UBound are both out of bounds
actual = UBound(args) + (1 - LBound(args))
CleanExit:
result = (expected = actual)
ValidateOrdinalArguments = result
Exit Function
CleanFail:
actual = 0
Resume CleanExit
End Function
Getting a disconnected Recordset
If we created a database connection, issued a command against it, and received the recordset from ADODB.Command.Execute, and then we close the connection and return that recordset, then the calling code can’t use the data anymore: a connected recordset only works if the calling code owns the connection. So we need a way to issue a disconnected recordset, while still using an ADODB.Command. The way to do this, is to pass the command as the Source argument to Recordset.Open, and to use a static, client-side cursor:
Private Function GetDisconnectedRecordset(ByVal cmd As ADODB.Command) As ADODB.Recordset
Errors.GuardNullReference cmd
Errors.GuardNullReference cmd.ActiveConnection
Dim result As ADODB.Recordset
Set result = New ADODB.Recordset
result.CursorLocation = adUseClient
result.Open Source:=cmd, CursorType:=adOpenStatic
Set result.ActiveConnection = Nothing
Set GetDisconnectedRecordset = result
End Function
Getting a single value result
With functions to validate the parameters, create commands and get a disconnected recordset, we have everything we need for IDbCommand implementations to do their job, but if we leave it like this, we’ll end up with all implementations copying the logic of IDbCommand.GetSingleValue: best have that logic in DbCommandBase and avoid as much repetition as possible.
Private Function GetSingleValue(ByVal db As IDbConnection, ByVal sql As String, ByRef args() As Variant) As Variant
Errors.GuardEmptyString sql
Dim cmd As ADODB.Command
Set cmd = CreateCommand(db, adCmdText, sql, args)
Dim results As ADODB.Recordset
Set results = GetDisconnectedRecordset(cmd)
GetSingleValue = results.Fields.Item(0).value
End Function
Creating the command
A few things can go wrong when creating the ADODB.Command object: we need an ADODB.Connection that’s open, and the parameters must be valid. Since we’re not executing the command just yet, we don’t have to worry about everything that could go wrong actually executing the command string and processing the parameters on the server. So the strategy here is to guard against invalid inputs as much as possible, then to handle errors when we add the parameters, and to return the Command object with whatever parameters were successfully added. We don’t need to try salvaging the rest of the parameters if one blows up, since that failing parameter will fail command execution anyway; there isn’t much we can do about it, other than perhaps throw an error and have the caller not even try to run the command – but here I decided that the server-side errors would be more useful than any custom “invalid parameter” error.
Note that the ADODB.Command object is actually created by the method-injected IDbConnection dependency. This creates a seam between the class and ADODB, despite the inherent coupling with the ADODB.Command type: it makes the command’s ActiveConnection an implementation detail of IDbConnection.CreateCommand, and that’s all I needed to make this method work with a stub connection that isn’t actually connecting to anything:
Private Function CreateCommand(ByVal db As IDbConnection, ByVal commandType As ADODB.CommandTypeEnum, ByVal sql As String, ByRef args() As Variant) As ADODB.Command
Errors.GuardNullReference db
Errors.GuardEmptyString sql
Errors.GuardExpression db.State <> adStateOpen, message:="Connection is not open."
Errors.GuardExpression Not ValidateOrdinalArguments(sql, args), message:="Arguments supplied are inconsistent with the provided command string parameters."
Dim cmd As ADODB.Command
Set cmd = db.CreateCommand(commandType, sql)
On Error GoTo CleanFail
Dim arg As ADODB.Parameter
For Each arg In this.ParameterProvider.FromValues(args)
cmd.parameters.Append arg
Next
CleanExit:
Set CreateCommand = cmd
Exit Function
CleanFail:
Resume CleanExit
End Function
DbCommand
As mentioned before, there are two implementations for the IDbCommand interface: one that creates and owns its own IDbConnection, the other that takes it in as a dependency.
This abstraction represents an object that can take an SQL statement and parameters, and return the result(s) to its caller.
DefaultDbCommand receives its IDbConnection dependency through property injection in its Create factory method.
AutoDbCommand takes a connection string and an IDbConnectionFactory instead.
UnitOfWork uses a DefaultDbCommand because the unit of work needs to own the connection, but AutoDbCommand could be used instead of a unit of work, if we just need a quick SELECT and no transaction.
Abstract Factory
IDbConnectionFactory is an Abstract Factory here. This is needed, because unit tests need to be able to inject a stub factory that produces stub connections: an abstract factory is a factory interface that creates objects of a type that is also an abstraction – in this case, IDbConnectionFactory.Create returns an IDbConnection object. Implementing this factory class is exactly as simple as you’d think – here’s DbConnectionFactory:
'@Exposed
'@Folder("SecureADODB.DbConnection")
'@ModuleDescription("An implementation of an abstract factory that creates DbConnection objects.")
Option Explicit
Implements IDbConnectionFactory
Private Function IDbConnectionFactory_Create(ByVal connString As String) As IDbConnection
Set IDbConnectionFactory_Create = DbConnection.Create(connString)
End Function
And here’s StubDbConnectionFactory:
'@Folder("Tests.Stubs")
'@ModuleDescription("A stub acting as a IDbConnectionFactory implementation.")
Option Explicit
Implements IDbConnectionFactory
Private Type TInvokeState
CreateConnectionInvokes As Long
End Type
Private this As TInvokeState
Private Function IDbConnectionFactory_Create(ByVal connString As String) As IDbConnection
this.CreateConnectionInvokes = this.CreateConnectionInvokes + 1
Set IDbConnectionFactory_Create = New StubDbConnection
End Function
Public Property Get CreateConnectionInvokes() As Long
CreateConnectionInvokes = this.CreateConnectionInvokes
End Property
The test stub is more “complex” because it tracks method invocations, so that tests can know whether & how many times any given member was invoked during a test run.
The Abstract Factory pattern is very useful with Dependency Injection: it gives us an abstraction to inject when a class needs a dependency that just cannot be created when the object itself is created. The alternative would be tight coupling: if we weren’t injecting a connection factory, then the command class would’ve had to be the one invoking DbConnection.Create – tightly coupling it with the DbConnection class, and instantly making unit testing impossible. An abstract factory removes the coupling, and allows unit tests to inject an alternative/stub implementation of the factory that creates StubDbConnection objects.
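In a test, wiring up the stub is a one-liner – a sketch using the AutoDbCommand factory method seen in the tests further down:
Dim sut As IDbCommand
Set sut = AutoDbCommand.Create("connection string", New StubDbConnectionFactory, New StubDbCommandBase)
'any connection the command creates is now a StubDbConnection: no database needed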
Wrapping it all up
AutoDbCommand can very well be consumed as-is by the client code:
Dim results As ADODB.Recordset
Set results = AutoDbCommand.Create(connString, New DbConnectionFactory, DbCommandBase.Create(AdoParameterProvider.Create(AdoTypeMappings.Default))).Execute(sql)
The only problem is that, well, the dependencies need to be resolved somehow, and that means the client code is now responsible for wiring everything up. While each component has a clear purpose, explicitly creating all these objects quickly gets old and redundant: we need an object that simplifies this – enter IUnitOfWork, and now we can use this much simpler code:
Dim results As ADODB.Recordset
Set results = UnitOfWork.FromConnectionString(connString).Command.Execute(sql)
Unit of Work is a design pattern that encapsulates a transaction: each individual operation can succeed or fail, and the unit of work either succeeds or fails as a whole. These notions are abstracted in the IUnitOfWork interface:
'@Folder("SecureADODB.UnitOfWork")
'@ModuleDescription("Represents an object encapsulating a database transaction.")
'@Interface
'@Exposed
Option Explicit
'@Description("Commits the transaction.")
Public Sub Commit()
End Sub
'@Description("Rolls back the transaction.")
Public Sub Rollback()
End Sub
'@Description("Creates a new command to execute as part of the transaction.")
Public Function Command() As IDbCommand
End Function
When a UnitOfWork is created, it initiates a database transaction. When it is destroyed before the transaction is committed, the transaction gets rolled back and from the database’s point of view, it’s like nothing happened.
Transaction?
If you’re unfamiliar with database transactions, there’s an easy example to illustrate what they do: imagine you have an Accounts table, and you’re processing a transfer – you need to UPDATE the record for the source account to deduct the transfer amount, then UPDATE the record for the destination account to add the transferred amount. In a happy world where everything goes well, that would be the end of it… but the world is a cruel place, and even if the 1st command goes through, nothing guarantees nothing will blow up when sending the 2nd command. Without transactions, the funds would simply vanish: they’re gone from the first account, and they were never added to the second account. With a transaction, we can roll back everything if the 2nd operation fails: no funds vanish, and the data is exactly the way it was before the transaction started.
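Here’s what that transfer might look like on top of IUnitOfWork – a sketch only: the Accounts table is made up, and IDbCommand.Execute is assumed to take the command string followed by its ordinal parameter values, like GetSingleValue does:
Public Sub TransferFunds(ByVal amount As Currency, ByVal fromId As Long, ByVal toId As Long)
    Dim uow As IUnitOfWork
    Set uow = UnitOfWork.FromConnectionString("connection string")
    On Error GoTo CleanFail
    With uow.Command
        .Execute "UPDATE Accounts SET Balance = Balance - ? WHERE Id = ?", amount, fromId
        .Execute "UPDATE Accounts SET Balance = Balance + ? WHERE Id = ?", amount, toId
    End With
    uow.Commit
    Exit Sub
CleanFail:
    uow.Rollback 'both updates are undone: no funds vanish
End Sub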
Again, the implementation is pretty straightforward – the only peculiarity is that the class has two factory methods – one named Create that takes all the dependencies in, and another named FromConnectionString that conveniently wires up a default set of dependencies (and then passes them to the Create method to avoid duplicating code).
'@Folder("SecureADODB.UnitOfWork")
'@ModuleDescription("An object that encapsulates a database transaction.")
'@PredeclaredId
'@Exposed
Option Explicit
Implements IUnitOfWork
Private Type TUnitOfWork
Committed As Boolean
RolledBack As Boolean
Connection As IDbConnection
CommandFactory As IDbCommandFactory
End Type
Private this As TUnitOfWork
'@Description("Creates a new unit of work using default configurations.")
'@Ignore ProcedureNotUsed
Public Function FromConnectionString(ByVal connString As String) As IUnitOfWork
Dim db As IDbConnection
Set db = DbConnection.Create(connString)
Dim provider As IParameterProvider
Set provider = AdoParameterProvider.Create(AdoTypeMappings.Default)
Dim baseCommand As IDbCommandBase
Set baseCommand = DbCommandBase.Create(provider)
Dim factory As IDbCommandFactory
Set factory = DefaultDbCommandFactory.Create(baseCommand)
Set FromConnectionString = UnitOfWork.Create(db, factory)
End Function
'@Inject: just an idea.. see #https://github.com/rubberduck-vba/Rubberduck/issues/5463
Public Function Create(ByVal db As IDbConnection, ByVal factory As IDbCommandFactory) As IUnitOfWork
Errors.GuardNonDefaultInstance Me, UnitOfWork
Errors.GuardNullReference factory
Errors.GuardNullReference db
Errors.GuardExpression db.State <> adStateOpen, message:="Connection should be open."
Dim result As UnitOfWork
Set result = New UnitOfWork
Set result.CommandFactory = factory
Set result.Connection = db
Set Create = result
End Function
'@Inject: this member should only be invoked by Me.Create, where Me is the class' default/predeclared instance.
'@Ignore ProcedureNotUsed: false positive with v2.5.0.5418
Friend Property Set Connection(ByVal value As IDbConnection)
Errors.GuardDoubleInitialization this.Connection
Errors.GuardNullReference value
Set this.Connection = value
this.Connection.BeginTransaction
End Property
'@Inject: this member should only be invoked by Me.Create, where Me is the class' default/predeclared instance.
'@Ignore ProcedureNotUsed: false positive with v2.5.0.5418
Friend Property Set CommandFactory(ByVal value As IDbCommandFactory)
Errors.GuardDoubleInitialization this.CommandFactory
Errors.GuardNullReference value
Set this.CommandFactory = value
End Property
Private Sub Class_Terminate()
On Error Resume Next
If Not this.Committed Then this.Connection.RollbackTransaction
On Error GoTo 0
End Sub
Private Sub IUnitOfWork_Commit()
Errors.GuardExpression this.Committed, message:="Transaction is already committed."
Errors.GuardExpression this.RolledBack, message:="Transaction was rolled back."
On Error Resume Next ' not all providers support transactions
this.Connection.CommitTransaction
this.Committed = True
On Error GoTo 0
End Sub
Private Function IUnitOfWork_Command() As IDbCommand
Set IUnitOfWork_Command = this.CommandFactory.Create(this.Connection)
End Function
Private Sub IUnitOfWork_Rollback()
Errors.GuardExpression this.Committed, message:="Transaction is already committed."
On Error Resume Next ' not all providers support transactions
this.Connection.RollbackTransaction
this.RolledBack = True
On Error GoTo 0
End Sub
Errors
If you paid close attention to the code listings so far, you likely already noticed the many Errors.GuardXxxxx member calls scattered throughout the code. There are probably as many ways to deal with custom errors as there are VBA classes out there; this is one way. Probably not the best way, but it feels “just right” for me in this case, and I think I like it enough to keep using it until the problems it creates become clearer (there’s always something). Errors is a standard private module in the project that defines custom error codes. Okay, I was lazy and deemed SecureADODBCustomError all I needed, but it could also have been an Enum with descriptive names for each custom error code.
The module simply exposes a small number of very simple Sub procedures that make it easy for the rest of the code to raise meaningful custom errors:
'@Folder("SecureADODB")
'@ModuleDescription("Global procedures for throwing common errors.")
Option Explicit
Option Private Module
Public Const SecureADODBCustomError As Long = vbObjectError Or 32
'@Description("Re-raises the current error, if there is one.")
Public Sub RethrowOnError()
With VBA.Information.Err
If .Number <> 0 Then
Debug.Print "Error " & .Number, .Description
.Raise .Number
End If
End With
End Sub
'@Description("Raises a run-time error if the specified Boolean expression is True.")
Public Sub GuardExpression(ByVal throw As Boolean, _
Optional ByVal Source As String = "SecureADODB.Errors", _
Optional ByVal message As String = "Invalid procedure call or argument.")
If throw Then VBA.Information.Err.Raise SecureADODBCustomError, Source, message
End Sub
'@Description("Raises a run-time error if the specified instance isn't the default instance.")
Public Sub GuardNonDefaultInstance(ByVal instance As Object, ByVal defaultInstance As Object, _
Optional ByVal Source As String = "SecureADODB.Errors", _
Optional ByVal message As String = "Method should be invoked from the default/predeclared instance of this class.")
Debug.Assert TypeName(instance) = TypeName(defaultInstance)
GuardExpression Not instance Is defaultInstance, Source, message
End Sub
'@Description("Raises a run-time error if the specified object reference is already set.")
Public Sub GuardDoubleInitialization(ByVal instance As Object, _
Optional ByVal Source As String = "SecureADODB.Errors", _
Optional ByVal message As String = "Object is already initialized.")
GuardExpression Not instance Is Nothing, Source, message
End Sub
'@Description("Raises a run-time error if the specified object reference is Nothing.")
Public Sub GuardNullReference(ByVal instance As Object, _
Optional ByVal Source As String = "SecureADODB.Errors", _
Optional ByVal message As String = "Object reference cannot be Nothing.")
GuardExpression instance Is Nothing, Source, message
End Sub
'@Description("Raises a run-time error if the specified string is empty.")
Public Sub GuardEmptyString(ByVal value As String, _
Optional ByVal Source As String = "SecureADODB.Errors", _
Optional ByVal message As String = "String cannot be empty.")
GuardExpression value = vbNullString, Source, message
End Sub
Most of these procedures are invoked as the first executable statement in a given scope, to raise an error given invalid parameters or internal state, such as these:
Private Sub IUnitOfWork_Commit()
Errors.GuardExpression this.Committed, message:="Transaction is already committed."
Errors.GuardExpression this.RolledBack, message:="Transaction was rolled back."
On Error Resume Next ' not all providers support transactions
this.Connection.CommitTransaction
this.Committed = True
On Error GoTo 0
End Sub
Consistently raising such errors is the single best way to ensure our objects are always in a known and usable state, because we outright forbid them to be invalid. These validation clauses are called guard clauses, hence the GuardXxxxx procedure names.
A lot of the unit tests simply verify that, given the specified conditions, the expected error is raised:
'@TestMethod("Factory Guard")
Private Sub Create_ThrowsIfNotInvokedFromDefaultInstance()
On Error GoTo TestFail
With New AutoDbCommand
On Error GoTo CleanFail
Dim sut As IDbCommand
Set sut = .Create("connection string", New StubDbConnectionFactory, New StubDbCommandBase)
On Error GoTo 0
End With
CleanFail:
If Err.Number = ExpectedError Then Exit Sub
TestFail:
Assert.Fail "Expected error was not raised."
End Sub
If each guard clause has a unit test, then the tests are effectively documenting how the objects are meant to be used. With more specific custom errors, the tests would be more accurate, but there’s a point where you need to look at what you’ve got and say “I think I can work with that”, and move on.
Audience
Obviously, one doesn’t import 20 classes into their VBA project just to send one ADODB command to a database server. However if you’re maintaining a VB6 application that uses ADODB all over the place, leaks connections, leaves recordsets dangling, …then importing this API can really help tighten up the data access code in that legacy app. Or maybe you’re writing a complex data-driven system in VBA for Excel because that’s all you’ve got, and a UnitOfWork abstraction makes sense for you.
The goal here is mostly to 1) demonstrate proper usage of ADODB.Command for secure, parameterized queries, and 2) demonstrate that Classic VB (VB6/VBA) has always had everything everyone ever needed to write full-blown object-oriented code that leverages abstraction, encapsulation, and polymorphism – making it possible to write clean and fully unit-tested code.
…and of course, it makes a great practical application of the OOP concepts discussed in many other articles on this blog. Studying the code in this project gives you insight on…
Authenticating the user of our application is a common problem, with common pitfalls – some innocuous, some fatal. It’s also a solved problem, with a fairly standard solution. Unfortunately, it’s also a problem that’s too often solved with naive, “good-enough” solutions that make any security expert twitch.
The vast majority of scenarios don’t need any custom authentication. Accessing a SQL Server database? Use Windows Authentication! Windows Auth not possible? Use SQL Authentication over a secure network! App authentication isn’t for authenticating a user with a server; rather, the application itself needs a concept of users and of privileges granted to certain groups of users, and so we need to prompt the user for a user name and a password. What could possibly go wrong?
Security First: Threat Model Assessment
The first question we need to ask ourselves, is literally “what could possibly go wrong?” — as in, what are we trying to do? If the answer is along the lines of:
Enhance user experience with tailored functionality
Grouping users into “roles” for easier management
Prevent accidental misuse of features
…then you’re on the right track. However if you’re thinking more in terms of…
Prevent intentional misuse of features
Securely prevent groups of users from accessing functionalities
Securely $(anything)
…then you’re going to need another kind of approach. VBA code is not secure, period. Even if the VBA project is password-protected, the VBE can be tricked into unlocking it with some clever Win32 API calls. So, the threat model should take into account that a power user that wants to see your code… will likely succeed …pretty easily, too.
That doesn’t mean VBA code gets a pass to do everything wrong! If you’re going to do password authentication, you might as well do it right.
Where to store users’ passwords?
We’ve all done this:
Private Const ADMIN_PWD As String = "@Dm!n"
…without realizing that the code of a VBA project – even locked – is compressed into a binary file that’s zipped with the rest of the Excel host document, and nothing prevents anyone from peeking at it with, say, Notepad++.
Of course it had to be on line 42.
Obviously, hard-coding passwords is the worst possible idea: we need somewhere safe, right?
Truth is, not really. You could have everything you need in a hidden worksheet that anyone can see if they like; a database server is ideal, of course, but not necessary, if parts of your host document can be used as one (looking at you too, Microsoft Access).
The reason it doesn’t matter if the “passwords table” is compromised, is because you do not store passwords, period. Not even encrypted: the “passwords table” contains nothing that can be processed (decrypted) and then used as a password.
What you store is a hash of the users’ passwords, never the passwords themselves. For example, if a user’s password was password and we hashed it with the SHA256 hashing algorithm, we would be storing the following value (the well-known SHA256 hash of the string “password”):
5E884898DA28047151D0E56F8DC6292773603D0D6AABBDD62A11EF721D1542D8
Contrary to encryption and encoding, there is by definition no way to revert a hash value back to the original string password. It’s possible that some random string that’s not password might produce the same hash value (i.e. a hash collision) – but very (very, very) unlikely, at least with SHA256 or higher.
There are many different hashing algorithms, producing values of varying lengths, at varying speeds: for cryptographically secure requirements, slow algorithms that produce values with a low risk of collision are preferred (harder/longer to brute-force). Other applications might use a faster MD5 hash that’s “good enough”, if not very secure, for many things – but not for a password.
Now obviously, if any two users have the same password, their SHA256 hash would be the same. If that’s a concern (it should be), then the solution is to use a salt: prepend a random string to the password, and hash the salted password string. Assuming every user gets a different salt value (it can be safely stored alongside the user record), it becomes impossible to tell whether any two users have the same password just by looking at the table contents… and this is why a hidden worksheet is a perfectly fine place to store your user passwords, if you can’t use a database for whatever reason.
Even if you know every user’s hash and salt value, you can’t know what password could possibly be prefixed with that particular salt value for a SHA256 hash algorithm to give exactly these specific bytes – even when every one of the users behind these records has the very same password.
Storing a salted password hash prevents “translating” the hash values wholesale, using a lookup/“rainbow” table that contains common passwords and their corresponding hash representation. Even if one password is compromised, other users with the same password wouldn’t be, because their hash is different, thanks to the “salt” bytes.
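To illustrate, using the ComputeHash function shown in the next section (the salt values here are made up):
'same password, different salts: entirely different, unrelated hashes
Debug.Print ComputeHash("8e6779" & "password") 'user A
Debug.Print ComputeHash("a41c2d" & "password") 'user B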
Whether we code in C#, PHP, JavaScript, Python, Java, ..or VBA, there’s simply not a single valid reason to store user passwords in plain text. But how do we get that hash value out of a password string in the first place?
Hashing with VBA
There’s… no built-in support whatsoever for hashing in VBA… but nothing says we can’t make explicit late binding and the .NET Framework work for us! Note that we’re invoking the ComputeHash_2 method, because it’s the overload of the ComputeHash method that takes the byte array we want to give it. COM/VBA doesn’t support method overloading, so when .NET exposes overloads to COM, it appends _2 to the second overload’s name, _3 to the third, and so on. The order depends on… the order they were written to the IDL, which means you could… just trust Stack Overflow on that one, and go with ComputeHash_2:
Public Function ComputeHash(ByVal value As String) As String
Dim bytes() As Byte
bytes = StrConv(value, vbFromUnicode)
Dim algo As Object
Set algo = CreateObject("System.Security.Cryptography.SHA256Managed")
Dim buffer() As Byte
buffer = algo.ComputeHash_2(bytes)
ComputeHash = ToHexString(buffer)
End Function
Private Function ToHexString(ByRef buffer() As Byte) As String
Dim result As String
Dim i As Long
For i = LBound(buffer) To UBound(buffer)
result = result & Right$("0" & Hex$(buffer(i)), 2) 'pad to two digits per byte: Hex drops leading zeros
Next
ToHexString = result
End Function
This code would feel right at home in a SHA256Managed standard module, or it could be a class that implements some IHashAlgorithm interface with a ComputeHash method – and with it we have everything we need to start handling password-based authentication in VBA …by today’s best practices.
I like having functionality neatly abstracted, so instead of just having a public ComputeHash function that computes the SHA256 hash for a given string, I’ll have a class module formalizing what a hash algorithm does:
'@Folder("Authentication.Hashing")
'@ModuleDescription("An interface representing a hashing algorithm.")
'@Interface
Option Explicit
'@Description("Computes a hash for the given string value.")
Public Function ComputeHash(ByVal value As String) As String
End Function
One implementation would be this SHA256Managed class module:
'@Folder("Authentication.Hashing")
'@PredeclaredId
Option Explicit
Implements IHashAlgorithm
Private base As HashAlgorithmBase
'@Description("Factory method creates and returns a new instance of this class.")
Public Function Create() As IHashAlgorithm
Set Create = New SHA256Managed
End Function
Private Sub Class_Initialize()
Set base = New HashAlgorithmBase
End Sub
Private Function IHashAlgorithm_ComputeHash(ByVal value As String) As String
Dim bytes() As Byte
bytes = StrConv(value, vbFromUnicode)
Dim algo As Object
Set algo = CreateObject("System.Security.Cryptography.SHA256Managed")
Dim buffer() As Byte
buffer = algo.ComputeHash_2(bytes)
IHashAlgorithm_ComputeHash = base.ToHexString(buffer)
End Function
By coding against an interface (i.e. by invoking ComputeHash off the IHashAlgorithm interface), we are making the code easier to modify later without breaking things: if a functionality needs an MD5 hash algorithm instead of SHA256, we can implement an MD5Managed class and inject that, and no client code needs to be modified, because the code doesn’t care what specific algorithm it’s working with, as long as it implements the IHashAlgorithm interface.
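A consumer written against the interface doesn’t change when the algorithm does – a sketch (the HashPassword helper and the MD5Managed class are hypothetical):
Public Function HashPassword(ByVal algo As IHashAlgorithm, ByVal salt As String, ByVal password As String) As String
    'the caller decides which IHashAlgorithm implementation to supply
    HashPassword = algo.ComputeHash(salt & password)
End Function

'Debug.Print HashPassword(SHA256Managed.Create, "8e6779", "password")
'Debug.Print HashPassword(MD5Managed.Create, "8e6779", "password") 'hypothetical alternative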
The HashAlgorithmBase class is intended to be used by all implementations of IHashAlgorithm, so we’re using composition to simulate inheritance here (the coupling is intended, there’s no need to inject that object as a dependency). The class simply exposes the ToHexString function, so that any hashing algorithm can get a hex string out of a byte array:
'@Folder("Authentication.Hashing")
'@ModuleDescription("Provides common functionality used by IHashAlgorithm implementations.")
Option Explicit
'@Description("Converts a byte array to a string representation.")
Public Function ToHexString(ByRef buffer() As Byte) As String
Dim result As String
Dim i As Long
For i = LBound(buffer) To UBound(buffer)
result = result & Right$("0" & Hex$(buffer(i)), 2) 'pad to two digits per byte: Hex drops leading zeros
Next
ToHexString = result
End Function
At this point we can already test the hashing algorithm in the immediate pane:
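Something like this (output shown assuming the two-digit-padded ToHexString above):
?SHA256Managed.Create.ComputeHash("password")
5E884898DA28047151D0E56F8DC6292773603D0D6AABBDD62A11EF721D1542D8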
The next step is to create an object that’s able to take user credentials, and tell its caller whether or not the credentials are good. This is much simpler than it sounds like.
UserAuthModel
The first thing we need to address, is the data we’re going to be dealing with – the model. In the case of a dialog that’s prompting for a user name and a password, our model is going to be a simple class exposing Name and Password read/write properties, and here an IsValid property returns True if the Name and Password values aren’t empty:
'@Folder("Authentication")
Option Explicit
Private Type TAuthModel
Name As String
Password As String
IsValid As Boolean
End Type
Private this As TAuthModel
Public Property Get Name() As String
Name = this.Name
End Property
Public Property Let Name(ByVal value As String)
this.Name = value
Validate
End Property
Public Property Get Password() As String
Password = this.Password
End Property
Public Property Let Password(ByVal value As String)
this.Password = value
Validate
End Property
Public Property Get IsValid() As Boolean
IsValid = this.IsValid
End Property
Private Sub Validate()
this.IsValid = Len(this.Name) > 0 And Len(this.Password) > 0
End Sub
Since this isn’t a model for changing a password, the validation logic doesn’t need to worry about the password’s length and/or content – only that a non-empty value was provided; your mileage may vary!
If we wanted the UI to provide a ComboBox dropdown to pick a user name, then the model class would need to encapsulate an array or collection that contains the user names, and that array or collection would be provided by another component.
IAuthService
When my object-oriented brain thinks “authentication”, what shapes up in my mind is a simple interface that exposes a single Boolean-returning function that takes user credentials, and returns True when authentication succeeds with the provided credentials.
Something like this:
'@Folder("Authentication")
'@ModuleDescription("An interface representing an authentication mechanism.")
'@Interface
Option Explicit
'@Description("True if the supplied credentials are valid, False otherwise.")
Public Function Authenticate(ByVal model As UserAuthModel) As Boolean
End Function
If we have a hidden worksheet with a table containing the user names, salt values, and hashed passwords for all users, then we could implement this interface with some WorksheetAuthService class that might look like this:
'@Folder("Authentication")
'@ModuleDescription("A service responsible for authentication.")
'@PredeclaredId
Option Explicit
Implements IAuthService
Private Type TAuthService
    Algorithm As IHashAlgorithm
End Type

Private Type TUserAuthInfo
    Salt As String
    Hash As String
End Type

Private this As TAuthService
Public Function Create(ByVal hashAlgorithm As IHashAlgorithm) As IAuthService
    With New WorksheetAuthService
        Set .Algorithm = hashAlgorithm
        Set Create = .Self
    End With
End Function

Public Property Get Self() As IAuthService
    Set Self = Me
End Property

Public Property Get Algorithm() As IHashAlgorithm
    Set Algorithm = this.Algorithm
End Property

Public Property Set Algorithm(ByVal value As IHashAlgorithm)
    Set this.Algorithm = value
End Property
Private Function GetUserAuthInfo(ByVal user As String, ByRef outInfo As TUserAuthInfo) As Boolean
'gets the salt value & password hash for the specified user; returns False if user can't be retrieved.
    On Error GoTo CleanFail
    With PasswordsSheet.Table
        Dim nameColumnIndex As Long
        nameColumnIndex = .ListColumns("Name").Index
        Dim saltColumnIndex As Long
        saltColumnIndex = .ListColumns("Salt").Index
        Dim hashColumnIndex As Long
        hashColumnIndex = .ListColumns("PasswordHash").Index
        Dim userRowIndex As Long
        userRowIndex = Application.WorksheetFunction.Match(user, .ListColumns(nameColumnIndex).DataBodyRange, 0)
        outInfo.Salt = Application.WorksheetFunction.Index(.ListColumns(saltColumnIndex).DataBodyRange, userRowIndex)
        outInfo.Hash = Application.WorksheetFunction.Index(.ListColumns(hashColumnIndex).DataBodyRange, userRowIndex)
    End With
    GetUserAuthInfo = True
CleanExit:
    Exit Function
CleanFail:
    Debug.Print Err.Description
    Debug.Print "Unable to retrieve authentication info for user '" & user & "'."
    outInfo.Salt = vbNullString
    outInfo.Hash = vbNullString
    GetUserAuthInfo = False
    Resume CleanExit
End Function
Private Function IAuthService_Authenticate(ByVal model As UserAuthModel) As Boolean
    Dim info As TUserAuthInfo
    If Not model.IsValid Or Not GetUserAuthInfo(model.Name, outInfo:=info) Then Exit Function
    Dim pwdHash As String
    pwdHash = this.Algorithm.ComputeHash(info.Salt & model.Password)
    IAuthService_Authenticate = (pwdHash = info.Hash)
End Function
If we only look at the IAuthService_Authenticate implementation, we can easily tell what’s going on:
If for any reason we can’t identify the specified user / get its authentication info, we bail
Using the user’s Salt string, the hashing algorithm’s ComputeHash method gives us a hash string for the salted password.
Authentication succeeds if the hashed salted password matches the stored hash string for that user.
Note how the provided model.Password string isn’t being stored anywhere, and is never compared against anything directly: only its salted hash is.
The GetUserAuthInfo function is treated as an implementation detail here, but it could easily be promoted to its own IUserAuthInfoProvider interface + implementation. The role of that function is to get the Salt and PasswordHash values for a given user; here we’re pulling them from a table on a worksheet, but other implementations could be pulling them from a database. This is a concern in its own right, and could very easily be argued to belong in its own class, abstracted behind its own interface – something like the sketch below.
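A minimal sketch of what that interface might look like – hypothetical code, and note that since the private TUserAuthInfo type can’t appear in a public interface, the sketch returns the two values through individual out-parameters instead:
'@Folder("Authentication")
'@Interface
Option Explicit

'@Description("Attempts to retrieve the salt & password hash for the specified user; False if the user cannot be found.")
Public Function TryGetUserAuthInfo(ByVal user As String, ByRef outSalt As String, ByRef outHash As String) As Boolean
End Function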
IAuthView
If we’re going to have a dialog for the user to enter their credentials into, then everything we’ve seen about the Model-View-Presenter UI design pattern is applicable here – we already have our model, and now we need an abstraction for a view.
'@Folder("Authentication")
'@Interface
Option Explicit
'@Description("Shows the view as a modal dialog. Returns True unless the dialog is cancelled.")
Public Function ShowDialog() As Boolean
End Function
Public Property Get UserAuthModel() As UserAuthModel
End Property
From an abstract standpoint, the view is nothing more than a function that displays the dialog and returns False if the dialog was cancelled, True otherwise.
The concrete implementation will be a UserForm that includes two textboxes, two command buttons, and a few labels – like this:
The code-behind for the form is very simple:
Change handlers for the textboxes assign the corresponding model property
Click handlers for the command buttons simply Hide the form
A Create factory method takes a UserAuthModel object reference
Model is exposed for property injection (only the factory method uses this property)
'@Folder("Authentication")
'@PredeclaredId
Option Explicit
Implements IAuthView
Private Type TAuthDialog
    UserAuthModel As UserAuthModel
    IsCancelled As Boolean
End Type

Private this As TAuthDialog

Public Function Create(ByVal model As UserAuthModel) As IAuthView
    If model Is Nothing Then Err.Raise 5, TypeName(Me), "Model cannot be a null reference"
    Dim result As AuthDialogView
    Set result = New AuthDialogView
    Set result.UserAuthModel = model
    Set Create = result
End Function

Public Property Get UserAuthModel() As UserAuthModel
    Set UserAuthModel = this.UserAuthModel
End Property

Public Property Set UserAuthModel(ByVal value As UserAuthModel)
    Set this.UserAuthModel = value
End Property

Private Sub OnCancel()
    this.IsCancelled = True
    Me.Hide
End Sub

Private Sub Validate()
    OkButton.Enabled = this.UserAuthModel.IsValid
End Sub

Private Sub CancelButton_Click()
    OnCancel
End Sub

Private Sub OkButton_Click()
    Me.Hide
End Sub

Private Sub NameBox_Change()
    this.UserAuthModel.Name = NameBox.Text
    Validate
End Sub

Private Sub PasswordBox_Change()
    this.UserAuthModel.Password = PasswordBox.Text
    Validate
End Sub

Private Sub UserForm_QueryClose(Cancel As Integer, CloseMode As Integer)
    If CloseMode = VbQueryClose.vbFormControlMenu Then
        Cancel = True
        OnCancel
    End If
End Sub

Private Function IAuthView_ShowDialog() As Boolean
    Me.Show vbModal
    IAuthView_ShowDialog = Not this.IsCancelled
End Function

Private Property Get IAuthView_UserAuthModel() As UserAuthModel
    Set IAuthView_UserAuthModel = this.UserAuthModel
End Property
The important thing to note, is that the form itself doesn’t do anything: it’s just an I/O device your code uses to interface with the user – nothing more, nothing less. It collects user-provided data into a model, and ensures the dialog validates that model.
The form knows about the UserAuthModel and its properties (Name, Password, IsValid), and nothing else. It doesn’t know how to get a list of user names to populate a dropdown so that the user can pick a name from a list (that could be done, but then the model would need a UserNames property). It doesn’t know how to verify whether the provided password string is correct. It’s …just not its job to do anything other than relay messages to & from the user.
IAuthPresenter
We have a UserAuthModel that holds the user-supplied credentials. We have a WorksheetAuthService that can take these credentials and tell us if they’re good, using any IHashAlgorithm implementation. We’re missing an object that pieces it all together, and that’s the job of a presenter.
What we want is for the code that needs an authenticated user, to be able to consume a simple interface, like this:
'@Folder("Authentication")
'@ModuleDescription("Represents an object that can authenticate the current user.")
'@Interface
Option Explicit
'@Description("True if user is authenticated")
Public Property Get IsAuthenticated() As Boolean
End Property
'@Description("Prompts for user credentials")
Public Sub Authenticate()
End Sub
Now, any class that encapsulates functionality that involves authenticating the current user can be injected with an IAuthPresenter interface, and when IsAuthenticated is True we know our user is who they say they are. And if we inject the same instance everywhere, then the user only needs to enter their credentials once for the authentication state to be propagated everywhere – without using any globals!
'@Folder("Authentication")
'@PredeclaredId
'@ModuleDescription("Represents an object responsible for authenticating the current user.")
Option Explicit
Implements IAuthPresenter
Private Type TPresenter
    View As IAuthView
    AuthService As IAuthService
    IsAuthenticated As Boolean
End Type

Private this As TPresenter

Public Function Create(ByVal service As IAuthService, ByVal dialogView As IAuthView) As IAuthPresenter
    Dim result As AuthPresenter
    Set result = New AuthPresenter
    Set result.AuthService = service
    Set result.View = dialogView
    Set Create = result
End Function

Public Property Get AuthService() As IAuthService
    Set AuthService = this.AuthService
End Property

Public Property Set AuthService(ByVal value As IAuthService)
    Set this.AuthService = value
End Property

Public Property Get View() As IAuthView
    Set View = this.View
End Property

Public Property Set View(ByVal value As IAuthView)
    Set this.View = value
End Property

Private Sub IAuthPresenter_Authenticate()
    If Not this.View.ShowDialog Then Exit Sub
    this.IsAuthenticated = this.AuthService.Authenticate(this.View.UserAuthModel)
End Sub

Private Property Get IAuthPresenter_IsAuthenticated() As Boolean
    IAuthPresenter_IsAuthenticated = this.IsAuthenticated
End Property
At this point any standard module macro (aka entry point) can create the presenter and its dependencies:
Public Sub DoSomething()
    Dim model As UserAuthModel
    Set model = New UserAuthModel

    Dim dialog As IAuthView
    Set dialog = AuthDialogView.Create(model)

    Dim algo As IHashAlgorithm
    Set algo = SHA256Managed.Create()

    Dim service As IAuthService
    Set service = WorksheetAuthService.Create(algo)

    Dim presenter As IAuthPresenter
    Set presenter = AuthPresenter.Create(service, dialog)

    presenter.Authenticate
    If presenter.IsAuthenticated Then
        MsgBox "Welcome!", vbInformation
    Else
        MsgBox "Access denied", vbExclamation
    End If
End Sub
If this were real application code, instead of consuming the presenter it would be injecting it into some class instance, and invoking a method on that class. This composition root (where we compose the application, i.e. instantiate and inject all the dependencies) would probably be in the Workbook_Open handler, so that the authentication state can be shared between components – something like the sketch below.
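A minimal sketch of such a composition root, assuming a hypothetical App class that receives the presenter and drives the actual functionality (the App class and its Create method are illustrative, not part of the code above):
'ThisWorkbook code-behind
Option Explicit
Private authApp As App

Private Sub Workbook_Open()
    Dim model As UserAuthModel
    Set model = New UserAuthModel

    Dim presenter As IAuthPresenter
    Set presenter = AuthPresenter.Create( _
        WorksheetAuthService.Create(SHA256Managed.Create()), _
        AuthDialogView.Create(model))

    'inject the same presenter instance into everything that needs the authentication state
    Set authApp = App.Create(presenter)
End Sub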
Authorisation
Up to this point, we only cared about authentication, i.e. identifying the current user. While very useful, that doesn’t tell us who’s authorized to do what: without some pretty ugly code that special-cases specific users (e.g. “Admin”), we’re pretty limited here.
One proven solution is to use role-based authorisation: users belong to a “group” of users, and it’s the group that’s authorized to do things, not the individual users.
In order to do this, the WorksheetAuthService implementation needs to be modified to add a RoleId member to the TUserAuthInfo, and the IAuthService.Authenticate method could return a Long instead of a Boolean, where 0 would still mean a failed authentication, but any non-zero value would be the authenticated user’s RoleId.
Roles could be defined by an enum (note the default / 0 value):
Public Enum AuthRole
    Unauthorized = 0
    Admin
    Maintenance
    Auditing
End Enum
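Under that scheme, the IAuthService.Authenticate function could return the role rather than a Boolean – a sketch of the modification described above:
'@Description("Returns the authenticated user's role; Unauthorized (0) when authentication fails.")
Public Function Authenticate(ByVal model As UserAuthModel) As AuthRole
End Function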
Or, role membership could be controlled in Active Directory (AD), using security groups – in that case you’ll want your IAuthService implementation to query AD instead of a worksheet, and the IAuthPresenter implementation to hold the current user’s role ID along with its authentication status.
There are many ways to go about implementing authentication, and many implementation-specific concerns. For example, if you’re querying a database for this, you’ll want to use commands and proper parameterization to avoid the problems associated with SQL Injection vulnerabilities: maybe a user named Robert');DROP TABLE USERS;-- isn’t part of your threat model, but can Tom O'Neil log onto your system without breaking anything?
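For example, with ADODB the user lookup might be parameterized along these lines – a minimal sketch, assuming an open ADODB.Connection named conn, the user name in a userName string variable, and a Users table (all illustrative names):
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
Set cmd.ActiveConnection = conn
cmd.CommandText = "SELECT Salt, PasswordHash FROM Users WHERE Name = ?;"
'the user-provided name travels as a parameter value; it never gets concatenated into the SQL text
cmd.Parameters.Append cmd.CreateParameter("UserName", adVarWChar, adParamInput, 255, userName)
Dim rs As ADODB.Recordset
Set rs = cmd.Execute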
Regardless of the approach, if you’re comparing the user’s plain-text password input with the plain-text password stored in $(storage_medium), you’re doing it wrong – whether that’s in VBA or not.
The TryParse Pattern
Error-handling in VBA can easily get hairy. The best error-handling code is no error-handling code at all, and by writing our code at a high enough abstraction level we can achieve exactly that – and leave the gory details in small, specialized, lower-abstraction procedures.
I’m growing rather fond of adapting the famous TryParse Pattern, borrowed from the .NET landscape, to VBA code. Not really for performance reasons (VBA doesn’t deal with exceptions or stack traces), but for the net readability and abstraction gains. The crux of it is, you write a small, specialized function that returns a Boolean and takes a ByRef parameter for the return value – like this:
Public Function TryDoSomething(ByVal arg As String, ByRef outResult As Object) As Boolean
    'only return True and set outResult to a valid reference if successful
End Function
Let the calling code decide what to do with a failure – don’t pop a MsgBox in such a function: it’s the caller’s responsibility to know what to do when you return False.
The pattern comes from methods like bool Int32.TryParse(string, out Int32) in .NET, where an exception-throwing Int32 Int32.Parse(string) equivalent method is also provided: whenever there’s a TryDoSomething method, there’s an equivalent DoSomething method that is more straightforward, but also more risky.
Applied consistently, the Try prefix tells us that the last argument is a ByRef parameter that is meant to hold the return value; the out prefix is Apps Hungarian (the actual original intent of Hungarian Notation, before “Systems Hungarian” reduced it to encoding data types) that the calling code can see with IntelliSense, screaming “this argument is your result, and must be passed by reference” – even though IntelliSense doesn’t show the ByRef modifier.
This pattern is especially useful to simplify error handling and replace it with standard flow control, like If statements. For example you could have a TryFind function that takes a Range object along with something to find in that range, invokes Range.Find, and only returns True if the result isn’t Nothing:
Dim result As Range
If Not TryFind(Sheet1.Range("A:A"), "test", result) Then
    MsgBox "Range.Find yielded no results.", vbInformation
    Exit Sub
End If
result.Activate 'result is guaranteed to be usable here
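The TryFind function itself might look like this minimal sketch (the Range.Find arguments shown are assumptions – adjust LookIn/LookAt to taste; conveniently, Range.Find returns Nothing rather than raising an error when the value isn’t found):
Public Function TryFind(ByVal source As Range, ByVal value As Variant, ByRef outResult As Range) As Boolean
    Set outResult = source.Find(What:=value, LookIn:=xlValues, LookAt:=xlWhole, MatchCase:=False)
    TryFind = Not outResult Is Nothing
End Function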
It’s especially useful for things that can raise a run-time error you have no control over – like opening a workbook off a user-provided String input, opening an ADODB database connection, or anything else that might fail for any reason well out of your control, and all your code needs to know is whether it worked or not.
Public Function TryOpenConnection(ByVal connString As String, ByRef outConnection As ADODB.Connection) As Boolean
    Dim result As ADODB.Connection
    Set result = New ADODB.Connection
    On Error GoTo CleanFail
    result.Open connString
    If result.State = adStateOpen Then
        TryOpenConnection = True
        Set outConnection = result
    End If
CleanExit:
    Exit Function
CleanFail:
    Debug.Print "TryOpenConnection failed with error: " & Err.Description
    Set result = Nothing
    Resume CleanExit
    'Resume 'uncomment & step through to debug the failing statement
End Function
The function returns True if the connection was successfully opened, False otherwise – regardless of whether that’s because the connection string is malformed, the server wasn’t found, or the connection timed out. If the calling code only needs to care about whether or not the connection succeeded, it’s perfect:
Dim adoConnection As ADODB.Connection
If Not TryOpenConnection(connString, adoConnection) Then
    MsgBox "Could not connect to database.", vbExclamation
    Exit Function
End If
'proceed to consume the successfully opened connection
Note how Exit Sub/Exit Function are leveraged, to put a quick end to the doomed procedure’s misery… and let the rest of it confidently resume with the assurance that it’s working with an open connection, without a nesting level: having the rest of the procedure in an Else block would be redundant.
The .NET guideline about offering a TryDoSomething/DoSomething method pair is taken from Framework Design Guidelines, an excellent book with plenty of very sane conventions – but unless you’re writing a VBA Framework “library” project, it’s almost certainly unnecessary to include the error-throwing sister method. YAGNI: You Ain’t Gonna Need It.
Cool. Can it be abused though?
Of course, and easily so: any TryDoSomethingThatCouldNeverRaiseAnError method would be weird. Keep the Try prefix for methods that make you dodge that proverbial error-handling bullet. Parameters should generally be passed ByVal, and if there’s a result to return, it should be returned as a Function procedure’s return value.
If a function needs to return more than one result and you find yourself using ByRef parameters for outputs, consider reevaluating its responsibilities: there’s a chance it might be doing more than it should. Or if the return values are so closely related they could be expressed as one thing, consider extracting them into a small class.
The GridCoord class in the OOP Battleship project is a great example of this: systematically passing X and Y values together everywhere quickly gets old. Turning them into an object suddenly gives us the ability to pass them around as one single entity – and we also get to compare a coordinate with another for equality or intersection, or to evaluate whether that other coordinate is adjacent; the object knows how to represent itself as a String value, and the rest of the code consumes it through the read-only IGridCoord interface. All of that functionality would have to be written somewhere else if X and Y were simply two Long integer values.
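Stripped down to the bare idea (a simplified sketch, not the actual Battleship code), such a read-only interface could look like this:
'@Interface
Option Explicit

Public Property Get X() As Long
End Property

Public Property Get Y() As Long
End Property

'@Description("True if the other coordinate is immediately adjacent to this one.")
Public Function IsAdjacent(ByVal other As IGridCoord) As Boolean
End Function

Public Function ToString() As String
End Function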