A simple CRUD library for PostgreSQL with generics in Go
Posted on Mon 24 Oct 2022 to Programming

I've written previously about my thoughts on ORMs in Go, and how the typical Active Record style ORMs seem to be a poor fit as a whole. With the advent of generics, however, I've decided to revisit this topic and see how generics could be utilised to make working with databases easier when it comes to modelling data. The part I'm mostly interested in is providing a thin abstraction over the typical CRUD (create, read, update, delete) operations an ORM would provide for each model. Ideally, we want to provide this abstraction in a way that makes as few assumptions about the data being modelled as possible, and that builds on top of what has already been explored with regards to query building. I'm not going to cover what generics are in this post, as the authors of the language have already done that; instead we're going to dive right in.
Note: For what we're going to be building we will be utilising the Go libraries github.com/jackc/pgx and github.com/andrewpillar/query.
- Models and stores
- Model creation
- Model updating
- Model deletion
- Model reading
- Implementing a model
- Conclusion
Models and stores
Pretty much every ORM has a concept of a model, that is, a structure that represents some data in a database. These typically offer the ability to seamlessly create, update, and delete the records of data they map to. For our library we will also want a way of representing our data, and a mechanism by which to perform CRUD operations. For this, we will have the concept of a Model, for modelling data, and a Store, for performing the CRUD operations. So, let's implement both of these, first the Model,
type ScanFunc func(dest ...any) error

type Model interface {
	Primary() (string, any)
	Scan(fields []string, scan ScanFunc) error
	Params() map[string]any
}
the Model is implemented as an interface with the methods Primary, Scan, and Params. The Primary method returns the name of the column used as the primary key for the Model, and the value for that column, if any. The Scan method would be invoked when scanning data from the Model's underlying table into said Model; this would be given the fields being scanned, and the function to call to actually perform the scan. The Params method returns the parameters of the Model to be used during create and update operations. With the Model defined, we can use this as a type parameter for our Store implementation,
type Store[M Model] struct {
	*pgxpool.Pool

	table string
	new   func() M
}

func NewStore[M Model](pool *pgxpool.Pool, table string, new func() M) *Store[M] {
	return &Store[M]{
		Pool:  pool,
		table: table,
		new:   new,
	}
}
the Store implementation is a struct with a type parameter for anything that implements a Model. This allows the same Store to be used across multiple implementations of the Model interface. It takes a PostgreSQL connection pool from which to take connections, the name of the table we will operate on, and a callback function for instantiating new Models, which will be used when scanning models from the database.
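To see the generic mechanics in isolation, here is a runnable sketch with the Model interface trimmed down to just Primary, and the database pool and table operations stripped out. The shape of the Store, the type parameter, and the new callback follow the article; everything omitted is just to keep the sketch self-contained.

```go
package main

import "fmt"

// A trimmed-down Model: only the Primary method, so the generic
// mechanics are visible without any database machinery.
type Model interface {
	Primary() (string, any)
}

// A toy store holding only the pieces relevant to the type parameter:
// the table name and the factory callback used to mint fresh Models.
type Store[M Model] struct {
	table string
	new   func() M
}

func NewStore[M Model](table string, new func() M) *Store[M] {
	return &Store[M]{table: table, new: new}
}

type User struct {
	ID int64
}

func (u *User) Primary() (string, any) { return "id", u.ID }

func main() {
	// The type parameter M is inferred as *User from the callback.
	users := NewStore("users", func() *User { return &User{} })

	m := users.new() // a fresh *User, ready to be scanned into
	col, _ := m.Primary()
	fmt.Println(users.table, col) // users id
}
```

The new callback is what lets a single generic Store mint fresh Models of the right concrete type without reflection.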
Model creation
With the basic implementation of both Model and Store done, we can now expand the Store to support create operations for any Model we give it. First, let's implement the query building part of creation,
func (s *Store[M]) Create(ctx context.Context, m M) error {
	p := m.Params()

	cols := make([]string, 0, len(p))
	vals := make([]any, 0, len(p))

	for k, v := range p {
		cols = append(cols, k)
		vals = append(vals, v)
	}

	primary, _ := m.Primary()

	q := query.Insert(
		s.table,
		query.Columns(cols...),
		query.Values(vals...),
		query.Returning(primary),
	)
...
in this bit, we iterate over the parameters returned from m.Params and place them in []string and []any slices for the columns and values respectively. We also call m.Primary to get the name of the primary key column, which we want returned from the query; this will allow us to scan the value back into the created model if it doesn't already have it, for example if the column is an auto-incrementing integer. Now, let's implement the rest, where we invoke the query and scan the results back in,
...
	rows, err := s.Query(ctx, q.Build(), q.Args()...)

	if err != nil {
		return err
	}

	defer rows.Close()

	if !rows.Next() {
		if err := rows.Err(); err != nil {
			return err
		}
		return nil
	}

	if err := m.Scan(s.fields(rows), rows.Scan); err != nil {
		return err
	}
	return nil
}
here, we pass the built query to the underlying database pool and scan the row that is returned. You'll notice how we invoke s.fields to get the fields from the rows that are returned; this is something that needs to be implemented, so let's do that next,
func (s *Store[M]) fields(rows pgx.Rows) []string {
	descs := rows.FieldDescriptions()
	fields := make([]string, 0, len(descs))

	for _, d := range descs {
		fields = append(fields, string(d.Name))
	}
	return fields
}
the reason we pass the field names to the Scan method is so that the Model will know which fields are being scanned. This reduces the assumptions the Store has to make about the Model and its fields, and allows more flexibility in the implementation of the Model.
Model updating
Next, let's implement the logic for updating a Model. This will be somewhat similar to what we did for creation,
func (s *Store[M]) Update(ctx context.Context, m M) error {
	p := m.Params()

	opts := make([]query.Option, 0, len(p))

	for k, v := range p {
		opts = append(opts, query.Set(k, query.Arg(v)))
	}

	primary, id := m.Primary()
	opts = append(opts, query.Where(primary, "=", query.Arg(id)))

	q := query.Update(s.table, opts...)

	if _, err := s.Exec(ctx, q.Build(), q.Args()...); err != nil {
		return err
	}
	return nil
}
simpler than what we did for create. Here, we use the values returned from the Primary method to construct the WHERE clause, ensuring only that Model will be updated. We did make one assumption though: that there is no data that needs to be scanned back into the model, the Model given to us at this point in time being assumed to contain all the data necessary for the update.
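To make the shape of the generated statement concrete, here is a hand-rolled illustration of roughly the SQL an update built this way would produce. The buildUpdate function below is a stand-in written for illustration only, not the query library's API; note that it sorts the column names, since iterating a Go map yields a random order on each run.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// buildUpdate is an illustrative stand-in showing roughly what the
// query builder assembles for Update: a SET clause from the params
// map and a WHERE clause on the primary key column.
func buildUpdate(table string, params map[string]any, primary string, id any) (string, []any) {
	cols := make([]string, 0, len(params))
	for k := range params {
		cols = append(cols, k)
	}
	sort.Strings(cols) // map iteration order is random; sort for a stable query

	sets := make([]string, 0, len(cols))
	args := make([]any, 0, len(cols)+1)
	for i, k := range cols {
		sets = append(sets, fmt.Sprintf("%s = $%d", k, i+1))
		args = append(args, params[k])
	}
	args = append(args, id)

	q := fmt.Sprintf("UPDATE %s SET %s WHERE %s = $%d",
		table, strings.Join(sets, ", "), primary, len(args))
	return q, args
}

func main() {
	q, args := buildUpdate("users",
		map[string]any{"email": "gordon.freeman@blackmesa.com", "username": "gordon.freeman"},
		"id", int64(1),
	)
	fmt.Println(q)
	// UPDATE users SET email = $1, username = $2 WHERE id = $3
	fmt.Println(len(args)) // 3
}
```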
Model deletion
Now, let's implement the logic for deleting a Model. Again, this will be similar to what we did for update, but simpler,
func (s *Store[M]) Delete(ctx context.Context, m M) error {
	col, id := m.Primary()

	q := query.Delete(s.table, query.Where(col, "=", query.Arg(id)))

	if _, err := s.Exec(ctx, q.Build(), q.Args()...); err != nil {
		return err
	}
	return nil
}
again, we use Primary to construct the WHERE clause for that specific Model.
Model reading
With the create, update, and delete operations implemented, we now need to implement the read operations. First, let's implement the Select and All methods for reading multiple Models,
func (s *Store[M]) Select(ctx context.Context, cols []string, opts ...query.Option) ([]M, error) {
	opts = append([]query.Option{
		query.From(s.table),
	}, opts...)

	q := query.Select(query.Columns(cols...), opts...)

	rows, err := s.Query(ctx, q.Build(), q.Args()...)

	if err != nil {
		return nil, err
	}

	defer rows.Close()

	fields := s.fields(rows)

	mm := make([]M, 0)

	for rows.Next() {
		m := s.new()

		if err := m.Scan(fields, rows.Scan); err != nil {
			return nil, err
		}
		mm = append(mm, m)
	}

	if err := rows.Err(); err != nil {
		return nil, err
	}
	return mm, nil
}

func (s *Store[M]) All(ctx context.Context, opts ...query.Option) ([]M, error) {
	return s.Select(ctx, []string{"*"}, opts...)
}
the Select method is what does most of the legwork, and will only select the given columns via cols, useful if you only want to load in a handful of parameters for a given Model. Select also makes use of the s.new callback for creating new Models to be scanned. The All method simply calls Select and specifies that every column should be selected.

With both Select and All implemented for returning multiple Models, let's now implement Get for returning a single Model,
func (s *Store[M]) Get(ctx context.Context, opts ...query.Option) (M, bool, error) {
	var zero M

	opts = append([]query.Option{
		query.From(s.table),
	}, opts...)

	q := query.Select(query.Columns("*"), opts...)

	rows, err := s.Query(ctx, q.Build(), q.Args()...)

	if err != nil {
		return zero, false, err
	}

	defer rows.Close()

	if !rows.Next() {
		if err := rows.Err(); err != nil {
			return zero, false, err
		}
		return zero, false, nil
	}

	m := s.new()

	if err := m.Scan(s.fields(rows), rows.Scan); err != nil {
		return zero, false, err
	}
	return m, true, nil
}
for Get we return a bool as the second value; this is used as a simple flag to denote whether any Model was found. At the very top of the method we define var zero M, which is simply the zero value of the Model that we can return alongside an error, or false. Again, we use the s.new callback to create the Model for scanning.
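The var zero M idiom is worth a brief aside: Go has no literal for "the zero value of a type parameter", so declaring a variable is the idiomatic way to get one. For a pointer Model such as *User it yields nil. A minimal, self-contained sketch:

```go
package main

import "fmt"

// zeroOf demonstrates the `var zero M` idiom used by Get: for any
// type parameter it yields that type's zero value.
func zeroOf[M any]() M {
	var zero M
	return zero
}

type User struct{ ID int64 }

func main() {
	u := zeroOf[*User]()
	fmt.Println(u == nil) // true: the zero value of a pointer is nil

	n := zeroOf[int64]()
	fmt.Println(n) // 0
}
```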
Implementing a model
We have our Store implemented, and a Model interface. Now, let's implement a simple Model that can utilise the Store,
func Scan(desttab map[string]any, fields []string, scan ScanFunc) error {
	dest := make([]any, 0, len(fields))

	for _, fld := range fields {
		if p, ok := desttab[fld]; ok {
			dest = append(dest, p)
		}
	}
	return scan(dest...)
}
type User struct {
	ID        int64
	Email     string
	Username  string
	Password  []byte
	CreatedAt time.Time
}

func (u *User) Primary() (string, any) {
	return "id", u.ID
}

func (u *User) Scan(fields []string, scan ScanFunc) error {
	return Scan(map[string]any{
		"id":         &u.ID,
		"email":      &u.Email,
		"username":   &u.Username,
		"password":   &u.Password,
		"created_at": &u.CreatedAt,
	}, fields, scan)
}

func (u *User) Params() map[string]any {
	return map[string]any{
		"email":      u.Email,
		"username":   u.Username,
		"password":   u.Password,
		"created_at": u.CreatedAt,
	}
}
you'll notice that with the above implementation we invoke a function called Scan within the implementation of the User.Scan method. This is a utility function that can be used across multiple Model implementations for scanning in fields. It ensures that only the fields we are given will be scanned, provided they exist in the given desttab.
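Since the Scan helper is plain Go with no database dependency, we can exercise it in isolation. The sketch below uses a stub ScanFunc standing in for pgx's rows.Scan, and shows that a requested field with no entry in desttab ("username" here) is simply skipped:

```go
package main

import "fmt"

type ScanFunc func(dest ...any) error

// Scan picks the destination pointers for the requested fields, in
// order, and hands them to the scan function. Fields with no entry
// in desttab are skipped.
func Scan(desttab map[string]any, fields []string, scan ScanFunc) error {
	dest := make([]any, 0, len(fields))

	for _, fld := range fields {
		if p, ok := desttab[fld]; ok {
			dest = append(dest, p)
		}
	}
	return scan(dest...)
}

func main() {
	var (
		id    int64
		email string
	)

	// A stub ScanFunc standing in for pgx's rows.Scan: it fills the
	// destinations with canned values so we can see what arrives.
	stub := func(dest ...any) error {
		*dest[0].(*int64) = 1
		*dest[1].(*string) = "gordon.freeman@blackmesa.com"
		return nil
	}

	// "username" is requested but has no destination, so only the
	// pointers for "id" and "email" reach the stub.
	err := Scan(map[string]any{
		"id":    &id,
		"email": &email,
	}, []string{"id", "email", "username"}, stub)

	fmt.Println(err, id, email) // <nil> 1 gordon.freeman@blackmesa.com
}
```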
With the above Model implementation we can now create a Store for it like so,
users := NewStore[*User](pool, "users", func() *User {
	return &User{}
})
then, we can use the Store too,
u := &User{
	Email:     "gordon.freeman@blackmesa.com",
	Username:  "gordon.freeman",
	Password:  []byte("secret"),
	CreatedAt: time.Now(),
}

if err := users.Create(ctx, u); err != nil {
	// Handle the error.
}

fmt.Println("user created with id of", u.ID)

u, ok, err := users.Get(ctx, query.Where("username", "=", query.Arg("gordon.freeman")))

if err != nil {
	// Handle the error.
}

if !ok {
	fmt.Println("user not found")
}

u.Password = []byte("unforseenconsequences")

if err := users.Update(ctx, u); err != nil {
	// Handle the error.
}
Conclusion
I know that I dove in pretty quickly with this. What I wanted to explore was the possibility of creating an ORM-like experience in Go with generics. I wanted to implement something that was simple and extensible, and that did not rely on direct reflection to achieve what an ORM typically does, whilst making minimal assumptions about the data being modelled.
The above code can be found at the following GitHub Gist, licensed under MIT, feel free to drop it into your code base and modify as you see fit. This won't be published as a conventional library, since it is less than 200 lines of code, and does not warrant being a library in my opinion. Furthermore, as I have stated, I think an ORM, if you could call this that, should make as few assumptions about the data being worked with as possible. Allowing you, the developer, to drop this into your code base, opens it up for extending even further according to your needs.