Phoenix Todos - Back-end Authentication

This post is written as a set of Literate Commits. The goal of this style is to show you how this program came together from beginning to end.

Each commit in the project is represented by a section of the article. Click each section's header to see the commit on Github, or check out the repository and follow along.

Written by Pete Corey on Sep 14, 2016.

Enter Guardian

Now we’re getting to the meat of our authentication system. We have our User model set up, but we need to associate users with active sessions.

This is where Guardian comes in. Guardian is an authentication framework that leverages JSON Web Tokens (JWT) and plays nicely with Phoenix Channels.

To use Guardian, we’ll first add it as a dependency to our application:


{:guardian, "~> 0.12.0"}

Next, we need to do some configuring:


config :guardian, Guardian,
  allowed_algos: ["HS512"], # optional
  verify_module: Guardian.JWT,  # optional
  issuer: "PhoenixTodos",
  ttl: { 30, :days },
  verify_issuer: true, # optional
  secret_key: %{"kty" => "oct", "k" => System.get_env("GUARDIAN_SECRET_KEY")},
  serializer: PhoenixTodos.GuardianSerializer

You’ll notice that I’m pulling my secret_key from my system’s environment variables. It’s a bad idea to keep secrets in version control.
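
If you’d rather fail fast than accidentally sign tokens with a nil key, a small guard in config/config.exs (my own suggestion, not part of the original commit) might look something like this:


# My own suggestion, not part of the original commit: raise at boot if the
# secret is missing, instead of silently signing tokens with a nil key.
guardian_secret =
  System.get_env("GUARDIAN_SECRET_KEY") ||
    raise "GUARDIAN_SECRET_KEY environment variable is not set"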

I also specified a serializer module. This is Guardian’s bridge into your system. It acts as a translation layer between Guardian’s JWT and your User model.

Because it’s unique to our system, we’ll need to build the PhoenixTodos.GuardianSerializer ourselves.

Our serializer will need two functions. The first, for_token, translates a User model into a token string. Given an invalid User, it should return an :error tuple:


test "generates token for valid user", %{user: user} do
  assert {:ok, _} = GuardianSerializer.for_token(user)
end

test "generates error for invalid user", %{} do
  assert {:error, "Invalid user"} = GuardianSerializer.for_token(%{})
end

Thanks to Elixir’s pattern matching, for_token is a very simple function:


def for_token(%User{id: id}), do: {:ok, "User:#{id}"}
def for_token(_), do: {:error, "Invalid user"}

Similarly, we need to define a from_token function, which takes a token string and returns the corresponding User model:


test "finds user from valid token", %{user: user} do
  {:ok, token} = GuardianSerializer.for_token(user)
  assert {:ok, _} = GuardianSerializer.from_token(token)
end

test "doesn't find user from invalid token", %{} do
  assert {:error, "Invalid user"} = GuardianSerializer.from_token("bad")
end

To implement this, we’ll pull the User id out of the token string, and look it up in the database:


def from_token("User:" <> id), do: {:ok, Repo.get(User, String.to_integer(id))}
def from_token(_), do: {:error, "Invalid user"}

Now that we’ve finished our serializer, we’re in a position to wire up the rest of our authentication system!
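
To get a feel for how Guardian will use this serializer, here’s a rough iex sketch (my own illustration, assuming a user with id 1 already exists; it isn’t part of the commit):


user = PhoenixTodos.Repo.get(PhoenixTodos.User, 1)

# Guardian calls our serializer's for_token/1 to build the token's "sub" claim.
{:ok, jwt, _claims} = Guardian.encode_and_sign(user, :token)

# Decoding the token and handing the "sub" claim back to the serializer
# returns the original user.
{:ok, claims} = Guardian.decode_and_verify(jwt)
{:ok, %PhoenixTodos.User{id: 1}} = PhoenixTodos.GuardianSerializer.from_token(claims["sub"])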

config/config.exs

...
   binary_id: false
+
+config :guardian, Guardian,
+  allowed_algos: ["HS512"], # optional
+  verify_module: Guardian.JWT,  # optional
+  issuer: "PhoenixTodos",
+  ttl: { 30, :days },
+  verify_issuer: true, # optional
+  secret_key: %{"kty" => "oct", "k" => System.get_env("GUARDIAN_SECRET_KEY")},
+  serializer: PhoenixTodos.GuardianSerializer

lib/phoenix_todos/guardian_serializer.ex

+defmodule PhoenixTodos.GuardianSerializer do
+  @behaviour Guardian.Serializer
+
+  alias PhoenixTodos.{User, Repo}
+
+  def for_token(%User{id: id}), do: {:ok, "User:#{id}"}
+  def for_token(_), do: {:error, "Invalid user"}
+
+  def from_token("User:" <> id), do: {:ok, Repo.get(User, String.to_integer(id))}
+  def from_token(_), do: {:error, "Invalid user"}
+end

mix.exs

...
     {:mix_test_watch, "~> 0.2", only: :dev},
-    {:comeonin, "~> 2.0"}]
+    {:comeonin, "~> 2.0"},
+    {:guardian, "~> 0.12.0"}]
   end

mix.lock

-%{"comeonin": {:hex, :comeonin, "2.5.2"}, +%{"base64url": {:hex, :base64url, "0.0.1"}, + "comeonin": {:hex, :comeonin, "2.5.2"}, "connection": {:hex, :connection, "1.0.4"}, "gettext": {:hex, :gettext, "0.11.0"}, + "guardian": {:hex, :guardian, "0.12.0"}, + "jose": {:hex, :jose, "1.8.0"}, "mime": {:hex, :mime, "1.0.1"}, "postgrex": {:hex, :postgrex, "0.11.2"}, - "ranch": {:hex, :ranch, "1.2.1"}} + "ranch": {:hex, :ranch, "1.2.1"}, + "uuid": {:hex, :uuid, "1.1.4"}}

test/lib/guardian_serializer_test.exs

+defmodule PhoenixTodos.GuardianSerializerTest do
+  use ExUnit.Case, async: true
+
+  alias PhoenixTodos.{User, Repo, GuardianSerializer}
+
+  setup_all do
+    user = User.changeset(%User{}, %{
+      email: "email@example.com",
+      password: "password"
+    })
+    |> Repo.insert!
+
+    {:ok, user: user}
+  end
+
+  test "generates token for valid user", %{user: user} do
+    assert {:ok, _} = GuardianSerializer.for_token(user)
+  end
+
+  test "generates error for invalid user", %{} do
+    assert {:error, "Invalid user"} = GuardianSerializer.for_token(%{})
+  end
+
+  test "finds user from valid token", %{user: user} do
+    {:ok, token} = GuardianSerializer.for_token(user)
+    assert {:ok, _} = GuardianSerializer.from_token(token)
+  end
+
+  test "doesn't find user from invalid token", %{} do
+    assert {:error, "Invalid user"} = GuardianSerializer.from_token("bad")
+  end
+end

Sign-Up Route and Controller

The first step to implementing authentication in our application is creating a back-end sign-up route that creates a new user in our system.

To do this, we’ll create an "/api/users" route that sends POST requests to the UserController.create function:


post "/users", UserController, :create

We expect the user’s email and password to be sent as parameters to this endpoint. UserController.create takes those params, passes them into our User.changeset, and then attempts to insert the resulting User into the database:


User.changeset(%User{}, params)
|> Repo.insert

If the insert fails, we return the changeset errors to the client:


conn
|> put_status(:unprocessable_entity)
|> render(PhoenixTodos.ApiView, "error.json", error: changeset)

Otherwise, we’ll use Guardian to sign the new user’s JWT and return the jwt and user objects down to the client:


{:ok, jwt, _full_claims} = Guardian.encode_and_sign(user, :token)
conn
|> put_status(:created)
|> render(PhoenixTodos.ApiView, "data.json", data: %{jwt: jwt, user: user})

Now all a user needs to do to sign up with our Todos application is send a POST request to /api/users with their email and password. In turn, they’ll receive their JWT which they can send along with any subsequent requests to verify their identity.
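
For example, attaching the JWT to a later request might look something like this in a ConnCase test (the route here is hypothetical, and the Guardian plugs that actually read this header are wired up in a later commit):


# Hypothetical example; "/api/protected" doesn't exist in this project.
conn
|> put_req_header("authorization", jwt)
|> get("/api/protected")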

test/controllers/user_controller_test.exs

+defmodule PhoenixTodos.UserControllerTest do
+  use PhoenixTodos.ConnCase
+
+  test "creates a user", %{conn: conn} do
+    conn = post conn, "/api/users", user: %{
+      email: "email@example.com",
+      password: "password"
+    }
+    %{
+      "jwt" => _,
+      "user" => %{
+        "id" => _,
+        "email" => "email@example.com"
+      }
+    } = json_response(conn, 201)
+  end
+
+  test "fails user validation", %{conn: conn} do
+    conn = post conn, "/api/users", user: %{
+      email: "email@example.com",
+      password: "pass"
+    }
+    %{
+      "errors" => [
+        %{
+          "password" => "should be at least 5 character(s)"
+        }
+      ]
+    } = json_response(conn, 422)
+  end
+end

web/controllers/user_controller.ex

+defmodule PhoenixTodos.UserController do
+  use PhoenixTodos.Web, :controller
+
+  alias PhoenixTodos.{User, Repo}
+
+  def create(conn, %{"user" => params}) do
+    User.changeset(%User{}, params)
+    |> Repo.insert
+    |> handle_insert(conn)
+  end
+
+  defp handle_insert({:ok, user}, conn) do
+    {:ok, jwt, _full_claims} = Guardian.encode_and_sign(user, :token)
+    conn
+    |> put_status(:created)
+    |> render(PhoenixTodos.ApiView, "data.json", data: %{jwt: jwt, user: user})
+  end
+  defp handle_insert({:error, changeset}, conn) do
+    conn
+    |> put_status(:unprocessable_entity)
+    |> render(PhoenixTodos.ApiView, "error.json", error: changeset)
+  end
+end

web/models/user.ex

...
   use PhoenixTodos.Web, :model
+  @derive {Poison.Encoder, only: [:id, :email]}

web/router.ex

... + scope "/api", PhoenixTodos do + pipe_through :api + + post "/users", UserController, :create + end + scope "/", PhoenixTodos do ... - # Other scopes may use custom stacks. - # scope "/api", PhoenixTodos do - # pipe_through :api - # end end

web/views/api_view.ex

+defmodule PhoenixTodos.ApiView do
+  use PhoenixTodos.Web, :view
+
+  def render("data.json", %{data: data}) do
+    data
+  end
+
+  def render("error.json", %{error: changeset = %Ecto.Changeset{}}) do
+    errors = Enum.map(changeset.errors, fn {field, detail} ->
+      %{} |> Map.put(field, render_detail(detail))
+    end)
+
+    %{ errors: errors }
+  end
+
+  def render("error.json", %{error: error}), do: %{error: error}
+
+  def render("error.json", %{}), do: %{}
+
+  defp render_detail({message, values}) do
+    Enum.reduce(values, message, fn {k, v}, acc ->
+      String.replace(acc, "%{#{k}}", to_string(v))
+    end)
+  end
+
+  defp render_detail(message) do
+    message
+  end
+
+end

Sign-In Route and Controller

Now that users have the ability to join our application, how will they sign into their accounts?

We’ll start implementing sign-in functionality by adding a new route to our Phoenix application:


post "/sessions", SessionController, :create

When a user sends a POST request to /sessions, we’ll route them to the create function in our SessionController module. This function will attempt to sign the user in with the credentials they provide.

At a high level, the create function is fairly straightforward. We look up the user based on the email they provided and check whether the password they supplied matches what we have on file:


def create(conn, %{"email" => email, "password" => password}) do
  user = get_user(email)
  user
  |> check_password(password)
  |> handle_check_password(conn, user)
end

If get_user returns nil, we couldn’t find the user based on the email address they provided. In that case, we’ll return false from check_password:


defp check_password(nil, _password), do: false

Otherwise, we’ll use Comeonin to check the password the user provided against the hash we have saved in encrypted_password:


defp check_password(user, password) do
  Comeonin.Bcrypt.checkpw(password, user.encrypted_password)
end

If all goes well, we’ll return a jwt and the user object for the now-authenticated user:


render(PhoenixTodos.ApiView, "data.json", data: %{jwt: jwt, user: user})

We can test this sign-in route/controller combination just like we’ve tested our sign-up functionality.

test/controllers/session_controller_test.exs

+defmodule PhoenixTodos.SessionControllerTest do
+  use PhoenixTodos.ConnCase
+
+  alias PhoenixTodos.{User, Repo}
+
+  test "creates a session", %{conn: conn} do
+    %User{}
+    |> User.changeset(%{
+      email: "email@example.com",
+      password: "password"
+    })
+    |> Repo.insert!
+
+    conn = post conn, "/api/sessions", email: "email@example.com", password: "password"
+    %{
+      "jwt" => _jwt,
+      "user" => %{
+        "id" => _id,
+        "email" => "email@example.com"
+      }
+    } = json_response(conn, 201)
+  end
+
+  test "fails authorization", %{conn: conn} do
+    conn = post conn, "/api/sessions", email: "email@example.com", password: "wrong"
+    %{
+      "error" => "Unable to authenticate"
+    } = json_response(conn, 422)
+  end
+end

web/controllers/session_controller.ex

+defmodule PhoenixTodos.SessionController do
+  use PhoenixTodos.Web, :controller
+
+  alias PhoenixTodos.{User, Repo}
+
+  def create(conn, %{"email" => email, "password" => password}) do
+    user = get_user(email)
+    user
+    |> check_password(password)
+    |> handle_check_password(conn, user)
+  end
+
+  defp get_user(email) do
+    Repo.get_by(User, email: String.downcase(email))
+  end
+
+  defp check_password(nil, _password), do: false
+  defp check_password(user, password) do
+    Comeonin.Bcrypt.checkpw(password, user.encrypted_password)
+  end
+
+  defp handle_check_password(true, conn, user) do
+    {:ok, jwt, _full_claims} = Guardian.encode_and_sign(user, :token)
+    conn
+    |> put_status(:created)
+    |> render(PhoenixTodos.ApiView, "data.json", data: %{jwt: jwt, user: user})
+  end
+  defp handle_check_password(false, conn, _user) do
+    conn
+    |> put_status(:unprocessable_entity)
+    |> render(PhoenixTodos.ApiView, "error.json", error: "Unable to authenticate")
+  end
+
+end

web/router.ex

...
     plug :accepts, ["json"]
+    plug Guardian.Plug.VerifyHeader
+    plug Guardian.Plug.LoadResource
   end
...
     post "/users", UserController, :create
+
+    post "/sessions", SessionController, :create
   end

Sign-Out Route and Controller

The final piece of our authentication trifecta is the ability for users to sign out once they’ve successfully joined or signed into the application.

To implement sign-out functionality, we’ll want to create a route that destroys a user’s session when it’s called by an authenticated user:


delete "/sessions", SessionController, :delete

This new route points to SessionController.delete. This function doesn’t exist yet, so let’s create it:


def delete(conn, _) do
  conn
  |> revoke_claims
  |> render(PhoenixTodos.ApiView, "data.json", data: %{})
end

revoke_claims will be a private function that looks up the current user’s token and claims, revokes them, and then hands the conn back so we can keep piping:


defp revoke_claims(conn) do
  {:ok, claims} = Guardian.Plug.claims(conn)

  Guardian.Plug.current_token(conn)
  |> Guardian.revoke!(claims)

  # Return the conn so delete/2 can keep piping into render/4.
  conn
end

In implementing this feature, we cleaned up our SessionControllerTest module a bit. We added a create_user function, which creates a user with a given email address and password, and a create_session function that logs that user in.

Using those functions we can create a user’s session, and then construct a DELETE request with the user’s JWT (session_response["jwt"]) in the "authorization" header. If this request is successful, we’ve successfully deleted the user’s session.

test/controllers/session_controller_test.exs

... - test "creates a session", %{conn: conn} do + defp create_user(email, password) do %User{} |> User.changeset(%{ - email: "email@example.com", - password: "password" - }) + email: email, + password: password + }) |> Repo.insert! + end - conn = post conn, "/api/sessions", email: "email@example.com", password: "password" - %{ - "jwt" => _jwt, - "user" => %{ - "id" => _id, - "email" => "email@example.com" - } - } = json_response(conn, 201) + defp create_session(conn, email, password) do + post(conn, "/api/sessions", email: email, password: password) + |> json_response(201) + end + + test "creates a session", %{conn: conn} do + create_user("email@example.com", "password") + + response = create_session(conn, "email@example.com", "password") + + assert response["jwt"] + assert response["user"]["id"] + assert response["user"]["email"] end ... end + + test "deletes a session", %{conn: conn} do + create_user("email@example.com", "password") + session_response = create_session(conn, "email@example.com", "password") + + conn + |> put_req_header("authorization", session_response["jwt"]) + |> delete("/api/sessions") + |> json_response(200) + end + end

web/controllers/session_controller.ex

...
+  def delete(conn, _) do
+    conn
+    |> revoke_claims
+    |> render(PhoenixTodos.ApiView, "data.json", data: %{})
+  end
+
+  defp revoke_claims(conn) do
+    {:ok, claims} = Guardian.Plug.claims(conn)
+    Guardian.Plug.current_token(conn)
+    |> Guardian.revoke!(claims)
+    conn
+  end
+
   def create(conn, %{"email" => email, "password" => password}) do

web/router.ex

... post "/sessions", SessionController, :create + delete "/sessions", SessionController, :delete end

Final Thoughts

As a Meteor developer, it seems like we’re spending a huge amount of time implementing authentication in our Phoenix Todos application. This functionality comes out of the box with Meteor!

The truth is that authentication is a massive, nuanced problem. Meteor’s Accounts system is a shining example of what Meteor does right. It abstracts an incredibly tedious but extremely important aspect of building web applications into an easy-to-use package.

On the other hand, Phoenix’s approach of forcing us to implement our own authentication system has its own set of benefits. By implementing authentication ourselves, we always know exactly what’s going on in every step of the process. There is no magic here. Complete control can be liberating.

Check back next week when we turn our attention back to the front-end, and wire up our sign-up and sign-in React templates!

Rewriting History

Written by Pete Corey on Sep 12, 2016.

If you’ve been following our blog, you’ll notice that we’ve been writing lots of what we’re calling “literate commit” posts.

The goal of a literate commit style post is to break down each Git commit into a readable, clear explanation of the code change. The idea is that this chronological narrative helps tell the story of how a piece of software came into being.

Combined with tools like git blame and git log, you can even generate detailed histories for small, focused sections of the codebase.

But sometimes generating repositories with this level of historical narrative requires something that most Git users warn against: rewriting history.

Why Change the Past

It’s usually considered bad practice to modify a project’s revision history, and in most cases this is true. However, there are certain situations where changing history is the right thing to do.

In our case, the main artifact of each literate commit project is not the software itself; it’s the revision history. The project serves as a lesson or tutorial.

In this situation, it might make sense to revise a commit message for clarity. Maybe we want to break a single, large commit into two separate commits so that each describes a smaller piece of history. Or, maybe while we’re developing we discover a small change that should have been included in a previous commit. Rather than making an “Oops, I should have done this earlier” commit, we can just change our revision history and include the change in the original commit.

It’s important to note that in these situations, it’s assumed that only one person will be working with the repository. If multiple people are contributing, editing revision history is not advised.

In The Beginning…

Imagine that we have some boilerplate that we use as a base for all of our projects. Being good developers, we keep track of its revision history using Git, and possibly host it on an external service like GitHub.

Starting a new project with this base might look something like this:


mkdir my_project
cd my_project
git clone https://github.com/pcorey/base .
git remote remove origin
git remote add origin https://github.com/pcorey/my_project

We’ve cloned base into the my_project directory, removed its origin pointer to the base repository, and replaced it with a pointer to a new my_project repository.

Great, but we’re still stuck with whatever commits existed in the base project before we cloned it into my_project. Those commits most likely don’t contribute to the narrative of this specific project and should be changed.

One solution to this problem is to clobber the Git history by removing the .git folder, but this is the nuclear option. There are easier ways of accomplishing our goal.

The --root flag of the git rebase command lets us revise every commit in our project, including the root commit. This means that we can interactively rebase and reword the root commits created in the base project:


git rebase -i --root master

reword f784c6a First commit
# Rebase f784c6a onto 5d85358 (1 command(s))

Using reword tells Git that we’d like to use the commit, but we want to modify its commit message. In our case, we might want to explain the project we’re starting and discuss the base set of files we pulled into the repository.

Splicing in a Commit

Next, let’s imagine that our project has three commits. The first commit sets up our project’s boilerplate. The second commit adds a file called foo.js, and the third commit updates that file:


git log --oneline

1d5f372 Updated foo.js
873641e Added foo.js
b3065c9 Project setup

What if we forgot to create a file called bar.js after we created foo.js? For maximum clarity, we want this file to be created in a new commit following 873641e. How would we do it?

Once again, interactive rebase comes to the rescue. While doing a root rebase, we can mark 873641e as needing editing:


git rebase -i --root master

pick b3065c9 Project setup
edit 873641e Added foo.js
pick 1d5f372 Updated foo.js

After starting the rebase, our Git HEAD will point to 873641e, and our git log looks like this:


git log --oneline

1d5f372 Updated foo.js
873641e Added foo.js

We can now add bar.js and commit the change:


touch bar.js
git add bar.js
git commit -am "Added bar.js"

Reviewing our log, we’ll see a new commit following our "Added foo.js" commit (note that the rebase has rewritten the earlier commit hashes):


git log --oneline

58f31fd Added bar.js
41817a4 Added foo.js
81df941 Project setup

Everything looks good. Now we can continue our rebase and check out our final revision history:


git rebase --continue
git log --oneline

b8b7b18 Updated foo.js
58f31fd Added bar.js
41817a4 Added foo.js
81df941 Project setup

We’ve successfully injected a commit into our revision history!

Revising a Commit

What if we notice a typo in our project that was introduced by our boilerplate? We don’t want to randomly include a typo fix in our Git history; that will distract from the overall narrative. How would we fix this situation?

Once again, we’ll harness the power of our interactive root rebase!


git rebase -i --root master

edit b3065c9 Project setup
pick 873641e Added foo.js
pick 1d5f372 Updated foo.js

After starting the rebase, our HEAD will point to the first commit, b3065c9. From there, we can fix our typo, and then amend the commit:


vim README.md
git add README.md
git commit --amend

Our HEAD is still pointing to the first commit, but now our fixed typo is included in the set of changes!

We can continue our rebase and go about our business, pretending that the typo never existed.


git rebase --continue

With Great Power

Remember, young Time Lord: with great power comes great responsibility.

Tampering with revision history can lead to serious losses for your project if done incorrectly. It’s recommended that you practice any changes you plan to make in another branch before attempting them in master. If things do go wrong, you can always fall back to a hard reset against origin/master:


git reset --hard origin/master

While changing history can be dangerous, it’s a very useful skill to have. When you want your history to be the main artifact of your work, it pays to ensure it’s as polished and perfected as possible.

Phoenix Todos - The User Model

This post is written as a set of Literate Commits. The goal of this style is to show you how this program came together from beginning to end.

Each commit in the project is represented by a section of the article. Click each section's header to see the commit on Github, or check out the repository and follow along.

Written by Pete Corey on Sep 7, 2016.

Create Users Table

Let’s focus on adding users and authentication to our Todos application. The first thing we’ll need to do is create a database table to hold our users, along with a corresponding User schema.

Thankfully, Phoenix comes with many generators that ease the process of creating things like migrations and models.

To generate our users migration, we’ll run the following mix command:


mix phoenix.gen.model User users email:string encrypted_password:string

We’ll modify the migration file the generator created for us and add NOT NULL restrictions on both the email and encrypted_password fields:


add :email, :string, null: false
add :encrypted_password, :string, null: false

We’ll also add an index on the email field for faster queries:


create unique_index(:users, [:email])

Great! Now we can run that migration with the mix ecto.migrate command.

priv/repo/migrations/20160901141548_create_user.exs

+defmodule PhoenixTodos.Repo.Migrations.CreateUser do
+  use Ecto.Migration
+
+  def change do
+    create table(:users) do
+      add :email, :string, null: false
+      add :encrypted_password, :string, null: false
+
+      timestamps
+    end
+
+    create unique_index(:users, [:email])
+  end
+end

Creating the Users Model

Now that we’ve created our users table, we need to create a corresponding User model. Phoenix actually did most of the heavy lifting for us when we ran the mix phoenix.gen.model command.

If we look in /web/models, we’ll find a user.ex file that holds our new User model. While the defaults generated for us are very good, we’ll need to make a few tweaks.

In addition to the :email and :encrypted_password fields, we’ll also need a virtual :password field.


field :password, :string, virtual: true

:password is virtual because it will be required by our changeset function, but will not be stored in the database.

Speaking of required fields, we’ll need to update our @required_fields and @optional_fields attributes to reflect the changes we’ve made:


@required_fields ~w(email password)
@optional_fields ~w(encrypted_password)

These changes to @required_fields break our auto-generated tests against the User model. We’ll need to update the @valid_attrs attribute in test/models/user_test.exs and replace :encrypted_password with :password:


@valid_attrs %{email: "user@example.com", password: "password"}

And with that, our tests flip back to green!

test/models/user_test.exs

+defmodule PhoenixTodos.UserTest do
+  use PhoenixTodos.ModelCase
+
+  alias PhoenixTodos.User
+
+  @valid_attrs %{email: "user@example.com", password: "password"}
+  @invalid_attrs %{}
+
+  test "changeset with valid attributes" do
+    changeset = User.changeset(%User{}, @valid_attrs)
+    assert changeset.valid?
+  end
+
+  test "changeset with invalid attributes" do
+    changeset = User.changeset(%User{}, @invalid_attrs)
+    refute changeset.valid?
+  end
+end

web/models/user.ex

+defmodule PhoenixTodos.User do
+  use PhoenixTodos.Web, :model
+
+  schema "users" do
+    field :email, :string
+    field :password, :string, virtual: true
+    field :encrypted_password, :string
+
+    timestamps
+  end
+
+  @required_fields ~w(email password)
+  @optional_fields ~w(encrypted_password)
+
+  @doc """
+  Creates a changeset based on the `model` and `params`.
+
+  If no params are provided, an invalid changeset is returned
+  with no validation performed.
+  """
+  def changeset(model, params \\ :empty) do
+    model
+    |> cast(params, @required_fields, @optional_fields)
+  end
+end

Additional Validation

While the default required/optional field validation is a good start, we know that we’ll need additional validations on our User models.

For example, we don’t want to accept email addresses without the "@" symbol. We can write a test for this in our UserTest module:


test "changeset with invalid email" do
  changeset = User.changeset(%User{}, %{
    email: "no_at_symbol",
    password: "password"
  })
  refute changeset.valid?
end

Initially this test fails, but we can quickly make it pass by adding basic regex validation to the :email field in our User.changeset function:


|> validate_format(:email, ~r/@/)

We can repeat this process for all of the additional validation we need, like checking password length and asserting email uniqueness.
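
Pulled out of the diff below for readability, the finished changeset pipeline ends up looking like this:


def changeset(model, params \\ :empty) do
  model
  |> cast(params, @required_fields, @optional_fields)
  |> validate_format(:email, ~r/@/)
  |> validate_length(:password, min: 5)
  |> unique_constraint(:email, message: "Email taken")
end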

test/models/user_test.exs

...
-  alias PhoenixTodos.User
+  alias PhoenixTodos.{User, Repo}
...
   end
+
+  test "changeset with invalid email" do
+    changeset = User.changeset(%User{}, %{
+      email: "no_at_symbol",
+      password: "password"
+    })
+    refute changeset.valid?
+  end
+
+  test "changeset with short password" do
+    changeset = User.changeset(%User{}, %{
+      email: "email@example.com",
+      password: "pass"
+    })
+    refute changeset.valid?
+  end
+
+  test "changeset with non-unique email" do
+    User.changeset(%User{}, %{
+      email: "email@example.com",
+      password: "password",
+      encrypted_password: "encrypted"
+    })
+    |> Repo.insert!
+
+    assert {:error, _} = User.changeset(%User{}, %{
+      email: "email@example.com",
+      password: "password",
+      encrypted_password: "encrypted"
+    })
+    |> Repo.insert
+  end
 end

web/models/user.ex

...
     |> cast(params, @required_fields, @optional_fields)
+    |> validate_format(:email, ~r/@/)
+    |> validate_length(:password, min: 5)
+    |> unique_constraint(:email, message: "Email taken")
   end

Hashing Our Password

You might have noticed that we had to manually set values for the encrypted_password field for our "changeset with non-unique email" test to run. This was to prevent the database from complaining about a non-null constraint violation.

Let’s remove those lines from our test and generate the password hash ourselves!

:encrypted_password was an unfortunate variable name choice. Our password is not being encrypted and stored in the database; that would be insecure. Instead we're storing the hash of the password.
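
To make that distinction concrete, here’s a quick iex-style sketch (my own illustration, not part of the commit) of what the comeonin package, which we’ll add below, actually gives us:


# Hashing is one-way; the salt makes every hash unique, and checkpw
# re-hashes the candidate password to compare it against the stored hash.
hash = Comeonin.Bcrypt.hashpwsalt("password")

Comeonin.Bcrypt.checkpw("password", hash) # => true
Comeonin.Bcrypt.checkpw("wrong", hash)    # => false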

We’ll use the comeonin package to hash our passwords, so we’ll add it as a dependency and an application in mix.exs:


def application do
  [...
   applications: [..., :comeonin]]
end

defp deps do
  [...
   {:comeonin, "~> 2.0"}]
end

Now we can write a private function that will update the :encrypted_password field on our User model if it’s given a valid changeset that’s updating the value of :password:


defp put_encrypted_password(changeset = %Ecto.Changeset{
  valid?: true,
  changes: %{password: password}
}) do
  changeset
  |> put_change(:encrypted_password, Comeonin.Bcrypt.hashpwsalt(password))
end

We’ll use pattern matching to handle the cases where a changeset is either invalid, or not updating the :password field:


defp put_encrypted_password(changeset), do: changeset

Isn’t that pretty? And with that, our tests are passing once again.

mix.exs

...
    applications: [:phoenix, :phoenix_html, :cowboy, :logger, :gettext,
-                  :phoenix_ecto, :postgrex]]
+                  :phoenix_ecto, :postgrex, :comeonin]]
  end
...
     {:cowboy, "~> 1.0"},
-    {:mix_test_watch, "~> 0.2", only: :dev}]
+    {:mix_test_watch, "~> 0.2", only: :dev},
+    {:comeonin, "~> 2.0"}]
   end

mix.lock

-%{"connection": {:hex, :connection, "1.0.4"}, +%{"comeonin": {:hex, :comeonin, "2.5.2"}, + "connection": {:hex, :connection, "1.0.4"}, "cowboy": {:hex, :cowboy, "1.0.4"},

test/models/user_test.exs

... email: "email@example.com", - password: "password", - encrypted_password: "encrypted" + password: "password" }) ... email: "email@example.com", - password: "password", - encrypted_password: "encrypted" + password: "password" })

web/models/user.ex

...
     |> unique_constraint(:email, message: "Email taken")
+    |> put_encrypted_password
   end
+
+  defp put_encrypted_password(changeset = %Ecto.Changeset{
+    valid?: true,
+    changes: %{password: password}
+  }) do
+    changeset
+    |> put_change(:encrypted_password, Comeonin.Bcrypt.hashpwsalt(password))
+  end
+  defp put_encrypted_password(changeset), do: changeset
 end

Final Thoughts

Things are starting to look very different from our original Meteor application. While Meteor tends to hide complexity from application developers by tucking that code away inside the framework itself, Phoenix expects developers to write much of this boilerplate code themselves.

While Meteor’s methodology lets developers get off the ground quickly, Phoenix’s philosophy of hiding nothing ensures that there’s no magic in the air. Everything works just as you would expect; it’s all right in front of you!

Additionally, Phoenix generators ease most of the burden of creating this boilerplate code.

Now that our User model is in place, we’re in prime position to wire up our front-end authentication components. Check back next week to see those updates!