SaltyCrane Blog — Notes on JavaScript and web development

How to useMemo to improve the performance of a React table

These are notes on how I improved the React rendering time of a large table using useMemo, the React Devtools, and careful attention to the referential equality of my data.

Problem

I had a table of 100 rows of select inputs. Changing a select input had a noticeable lag.

react table screenshot

React Profiler

I profiled the table with the Profiler in React Devtools and found that all the rows were re-rendering even though only one of them changed. The screenshot below shows that rendering my Table component took 239ms. The colored bars beneath the Table show that each of the 100 rows re-rendered. For more information, see this article on how to use the React Profiler.

react table screenshot

Row component

The table was built using React hooks and I sprinkled useMemo liberally in my code. Most of my data was memoized, but React was still re-rendering. Here is my row component:

const MappingRow = ({ id }) => {
  // ...
  const mapping = useMapping(state, id);
  const enabled = useEnabledFields(state, id);
  const { makeOptions, modelFamilyOptions, modelParentOptions, segmentOptions } = useMappingRowApis(id);

  const handleChange = field => selected => {
    const value = selected && selected.value;
    if (value === mapping[field]) {
      return;
    }
    const update = { [field]: value };
    dispatch({
      type: "save_mapping",
      promise: saveMapping(id, update),
      id,
      timeSaved: Date.now(),
      update,
    });
  };

  return (
    <tr>
      <Cell>{mapping.source}</Cell>
      <SelectCell
        isDisabled={!enabled.mappableMakeName}
        onChange={handleChange("makeCode")}
        options={makeOptions}
        value={mapping.makeCode}
      />
      <SelectCell
        isDisabled={!enabled.mappableModelParent}
        onChange={handleChange("modelParentId")}
        options={modelParentOptions}
        value={mapping.modelParentId}
      />
      <SelectCell
        isDisabled={!enabled.mappableModelFamilyName}
        onChange={handleChange("modelFamilyName")}
        options={modelFamilyOptions}
        value={mapping.modelFamilyName}
      />
      <SelectCell
        isDisabled={!enabled.mappableSegmentName}
        onChange={handleChange("segmentCode")}
        options={segmentOptions}
        value={mapping.segmentCode}
      />
    </tr>
  );
};

memo HOC

Even though the data provided by my custom hooks was memoized, I realized I still needed to apply React's memo higher-order component (HOC) to prevent re-rendering. I extracted a new MemoizedRow component so that I could wrap it with React's memo HOC. (Note: if this seems undesirable to you, see the end of this post.)

const MappingRow = ({ id }) => {
  // ...
  const mapping = useMapping(state, id);
  const enabled = useEnabledFields(state, id);
  const { makeOptions, modelFamilyOptions, modelParentOptions, segmentOptions } = useMappingRowApis(id);
  const handleChange = field => selected => {
    // ...
  };

  return (
    <MemoizedRow
      enabled={enabled}
      handleChange={handleChange}
      makeOptions={makeOptions}
      mapping={mapping}
      modelFamilyOptions={modelFamilyOptions}
      modelParentOptions={modelParentOptions}
      segmentOptions={segmentOptions}
    />
  );
};

const MemoizedRow = memo(props => {
  const {
    enabled,
    handleChange,
    makeOptions,
    mapping,
    modelFamilyOptions,
    modelParentOptions,
    segmentOptions,
  } = props;
  return (
    <tr>
      <Cell>{mapping.source}</Cell>
      <SelectCell
        isDisabled={!enabled.mappableMakeName}
        onChange={handleChange("makeCode")}
        options={makeOptions}
        value={mapping.makeCode}
      />
      <SelectCell
        isDisabled={!enabled.mappableModelParent}
        onChange={handleChange("modelParentId")}
        options={modelParentOptions}
        value={mapping.modelParentId}
      />
      <SelectCell
        isDisabled={!enabled.mappableModelFamilyName}
        onChange={handleChange("modelFamilyName")}
        options={modelFamilyOptions}
        value={mapping.modelFamilyName}
      />
      <SelectCell
        isDisabled={!enabled.mappableSegmentName}
        onChange={handleChange("segmentCode")}
        options={segmentOptions}
        value={mapping.segmentCode}
      />
    </tr>
  );
});

Referential equality or shallow equality

I applied the memo HOC, but profiling showed no change in performance. I thought I should useWhyDidYouUpdate. This custom hook revealed that some of my props were not equal when I expected them to be. One of them was my handleChange callback. This function is re-created on every render, so the reference from one render never compares equal to the reference from the next. Wrapping the function with useCallback memoizes it so that it compares equal unless one of its dependencies (mapping or id) changes.

const MappingRow = ({ id }) => {
  //...

  const handleChange = useCallback(
    field => selected => {
      const value = selected && selected.value;
      if (value === mapping[field]) {
        return;
      }
      const update = { [field]: value };
      dispatch({
        type: "save_mapping",
        promise: saveMapping(id, update),
        id,
        timeSaved: Date.now(),
        update,
      });
    },
    [mapping, id],
  );

  return (
    <MemoizedRow
      enabled={enabled}
      handleChange={handleChange}
      makeOptions={makeOptions}
      mapping={mapping}
      modelFamilyOptions={modelFamilyOptions}
      modelParentOptions={modelParentOptions}
      segmentOptions={segmentOptions}
    />
  );
};

Another problem was that my mapping data object was changing for every row even though only one row had actually changed. I was using the Immer library to create immutable data structures. Immutable data structures allow updating one slice of data in an object without changing the references of its sibling slices, so the unchanged slices compare equal when used with the memo HOC or PureComponent. I had thought my data was properly isolated and memoized; however, one piece of my state was breaking the memoization. Here is my code to return a single mapping data object for a row:

export const useMapping = (state, id) => {
  const {
    optimisticById,
    optimisticIds,
    readonlyById,
    writableById,
  } = state.mappings;
  const optimisticMapping = optimisticById[id];
  const readonlyMapping = readonlyById[id];
  const writableMapping = writableById[id];
  return useMemo(() => {
    const mapping = { ...readonlyMapping, ...writableMapping };
    return optimisticIds.includes(id)
      ? { ...mapping, ...optimisticMapping }
      : mapping;
  }, [id, optimisticIds, optimisticMapping, readonlyMapping, writableMapping]);
};

The optimisticIds state was used to store a list of ids of mapping items that had been updated by the user but not yet saved to the database. This list changed whenever any row was edited, yet it was used in creating the mapping data for every row in the table. Because optimisticIds is in the useMemo dependency array, whenever it changes, the mapping data is re-calculated and a new value is returned. The important part is not that running the code in this function is expensive. The important part is that the function returns a newly created object literal. Like the handleChange function created in the component above, object literals created at different times do not compare equally even if their contents are the same. For example, {} === {} is false in JavaScript. I realized I did not need the optimisticIds state, so I removed it. This left a memoized function that only recalculated when the data for its corresponding row in the table changed:

export const useMapping = (state, id) => {
  const { optimisticById, readonlyById, writableById } = state.mappings;
  const optimisticMapping = optimisticById[id];
  const readonlyMapping = readonlyById[id];
  const writableMapping = writableById[id];
  return useMemo(() => {
    const mapping = { ...readonlyMapping, ...writableMapping };
    return optimisticMapping ? { ...mapping, ...optimisticMapping } : mapping;
  }, [optimisticMapping, readonlyMapping, writableMapping]);
};
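The referential-equality rules at play here can be demonstrated in plain JavaScript (a standalone sketch, not the app's actual code):

```javascript
// Two object literals with identical contents are not equal by reference.
const a = { makeCode: "bmw" };
const b = { makeCode: "bmw" };
console.log(a === b); // false

// Spreading builds a brand-new object on every call, even when nothing
// changed -- exactly what the original useMapping did on every edit.
const state = { readonly: { source: "feed" }, writable: { makeCode: "bmw" } };
const merge = () => ({ ...state.readonly, ...state.writable });
console.log(merge() === merge()); // false

// An immutable update to one slice leaves sibling slices' references
// intact, which is what lets memoized rows skip re-rendering.
const next = { ...state, writable: { ...state.writable, makeCode: "audi" } };
console.log(next.readonly === state.readonly); // true
```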

20X improvement

After fixing these referential inequalities, the memo HOC eliminated the re-rendering of all but the edited row. The React profiler now showed the table rendered in 10ms, a 20X improvement.

react table screenshot

Refactoring to useMemo

To use the memo HOC, I had to extract a separate component for the sole purpose of applying it. I started to convert the HOC to a render prop so I could use it inline. Then I thought, aren't hooks supposed to replace most HOCs and render props? Someone should make a useMemo hook to do what the memo HOC does. Wait, there is already a useMemo hook... I wonder if...

const MappingRow = ({ id }) => {
  const mapping = useMapping(id);
  const enabled = useEnabledFields(id);
  const { makeOptions, modelFamilyOptions, modelParentOptions, segmentOptions } = useMappingRowApis(id);

  const handleChange = useCallback(
    field => selected => {
      const value = selected && selected.value;
      if (value === mapping[field]) {
        return;
      }
      const update = { [field]: value };
      dispatch({
        type: "save_mapping",
        promise: saveMapping(id, update),
        id,
        timeSaved: Date.now(),
        update,
      });
    },
    [dispatch, mapping, id],
  );

  return useMemo(
    () => (
      <tr>
        <Cell>{mapping.source}</Cell>
        <SelectCell
          isDisabled={!enabled.mappableMakeName}
          onChange={handleChange("makeCode")}
          options={makeOptions}
          value={mapping.makeCode}
        />
        <SelectCell
          isDisabled={!enabled.mappableModelParent}
          onChange={handleChange("modelParentId")}
          options={modelParentOptions}
          value={mapping.modelParentId}
        />
        <SelectCell
          isDisabled={!enabled.mappableModelFamilyName}
          onChange={handleChange("modelFamilyName")}
          options={modelFamilyOptions}
          value={mapping.modelFamilyName}
        />
        <SelectCell
          isDisabled={!enabled.mappableSegmentName}
          onChange={handleChange("segmentCode")}
          options={segmentOptions}
          value={mapping.segmentCode}
        />
      </tr>
    ),
    [
      enabled,
      handleChange,
      mapping,
      makeOptions,
      modelParentOptions,
      modelFamilyOptions,
      segmentOptions,
    ],
  );
};

Yes, applying useMemo to the returned JSX element tree had the same effect as applying the memo HOC, without the intrusive component refactor. I thought that was pretty cool. Dan Abramov has also tweeted about wrapping React elements with useMemo.
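Both memo and this useMemo trick come down to comparing references. React's memo HOC shallowly compares the previous and next props; a minimal plain-JavaScript sketch of that kind of comparison (illustrative only, not React's actual source) looks like this:

```javascript
// Shallow equality: same keys, each value identical according to Object.is.
function shallowEqual(prev, next) {
  const prevKeys = Object.keys(prev);
  const nextKeys = Object.keys(next);
  if (prevKeys.length !== nextKeys.length) return false;
  return prevKeys.every(key => Object.is(prev[key], next[key]));
}

const options = ["bmw", "audi"]; // stable reference (memoized upstream)
const handleChange = () => {}; // stable reference (wrapped in useCallback)

// Same references on both sides => equal => the memoized render is reused.
console.log(shallowEqual({ options, handleChange }, { options, handleChange })); // true

// A callback re-created during render breaks the comparison, as in this post.
console.log(shallowEqual({ options, handleChange }, { options, handleChange: () => {} })); // false
```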

New Mac setup notes 2019

install Homebrew

$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

brew cask install stuff

$ brew cask install iterm2
$ brew cask install emacs
$ brew cask install kdiff3

brew install stuff

$ brew install bat
$ brew install exa
$ brew install fzf
$ brew install glances
$ brew install gnupg
$ brew install icdiff
$ brew install markdown
$ brew install node@10
$ brew install postgresql
$ brew install python
$ brew install rbenv
$ brew install readline
$ brew install ripgrep
$ brew install screen
$ brew install tldr

copy SSH keys

set up homedir

$ cd /tmp
$ git clone git@github.com:saltycrane/homedir.git
$ rsync -avz homedir/ ~/

set up spacemacs

See https://github.com/saltycrane/.spacemacs.d#usage

install stuff via websites

misc stuff

  • fix the Meta/Alt/Option key in iTerm
    iTerm2 > Preferences > Profiles > Keys > Change Option key to "Esc+"
  • set a directory for screenshots
    $ defaults write com.apple.screencapture location /Users/eliot/Pictures 
    
  • allow gpg decrypting in emacs
    $ brew install pinentry-mac
    $ echo "pinentry-program /usr/local/bin/pinentry-mac" >> ~/.gnupg/gpg-agent.conf
    
  • python2 and pycrypto setup
    $ brew install python2
    $ pip install virtualenv
    $ brew install gmp # for pycrypto
    

see also

How to run PostgreSQL in Docker on Mac (for local development)

These are my notes for running Postgres in a Docker container for use with a local Django or Rails development server running on the host machine (not in Docker). Running in Docker keeps my database environment isolated from the rest of my system and allows running multiple versions and instances. (I previously had a problem where Homebrew upgraded Postgres when I didn't expect it to and my existing database became incompatible. Admittedly, I didn't know Homebrew well, but it was frustrating.) The main disadvantage of Docker is that it's another layer of abstraction to learn and interact with. We use Docker extensively at work, so from a mental-overhead point of view, it's something I wanted to learn anyway. Currently I use the Homebrew Postgres for work and Postgres in Docker for personal projects. I also wrote some notes on Postgres and Homebrew here.

Install Docker

OPTION 1: Run Postgres using a single Docker command

Run a postgres container
  • uses the official docker postgres 11 image
  • uses a volume named my_dbdata to store postgres data
  • maps container port 5432 to port 54320 on the host
  • sets the container name to my_postgres
  • uses the -d flag to run in the background
$ docker run -d --name my_postgres -v my_dbdata:/var/lib/postgresql/data -p 54320:5432 postgres:11

OPTION 2: Run Postgres using Docker Compose

Create a docker-compose.yml file
$ mkdir /tmp/myproject
$ cd /tmp/myproject

Create a new file docker-compose.yml:

version: "3"
services:
  db:
    image: "postgres:11"
    container_name: "my_postgres"
    ports:
      - "54320:5432"
    volumes:
      - my_dbdata:/var/lib/postgresql/data
volumes:
  my_dbdata:
  • uses docker compose file version 3
  • sets up a service named "db" (this name can be used with docker-compose commands)
  • uses the postgres:11 image from hub.docker.com
  • creates a container named "my_postgres"
  • maps port 5432 inside the container to port 54320 on the host machine
  • uses a volume named "my_dbdata" for storing the database data. Even if the container and image are deleted, the volume will remain unless explicitly deleted using docker volume rm
  • for more information, see the Docker Compose file reference
Start Postgres

Pull the postgres image from hub.docker.com, create a container named "my_postgres", and start it in the background:

$ docker-compose up -d

See that it's working

See the logs:

$ docker logs -f my_postgres

Try running psql:

$ docker exec -it my_postgres psql -U postgres

hit CTRL+D to exit

For other commands such as starting, stopping, listing or deleting, see my Docker cheat sheet.

Create a database

$ docker exec -it my_postgres psql -U postgres -c "create database my_database"

Connect using Python and psycopg2

$ python3.6 -m venv myenv
$ source myenv/bin/activate
$ pip install psycopg2-binary

Create a new file named myscript.py

import psycopg2

conn = psycopg2.connect(
    host='localhost',
    port=54320,
    dbname='my_database',
    user='postgres',
)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS test (id serial PRIMARY KEY, num integer, data varchar);")
cur.execute("INSERT INTO test (num, data) VALUES (%s, %s)", (100, "abcdef"))
cur.execute("SELECT * FROM test;")
result = cur.fetchone()
print(result)
conn.commit()
cur.close()
conn.close()

Run it

$ python myscript.py
(1, 100, 'abcdef')

See also

Magit in Spacemacs (evil-magit) notes

Magit and Org are two killer apps for Emacs. Here are my Magit notes using (the also excellent) Spacemacs (which uses evil-magit).


Show git status

  • SPC g s show Magit status view

Show help

  • SPC g s show Magit status view
  • ? get help

Show git log

  • SPC g s show Magit status view
  • l l show log view

Show all commits for the current file 1

  • SPC g f show git log for the current file

Diff a range of commits

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on a commit
  • V to select the line
  • use j and k to position the cursor on another commit
  • d r to show a diff of the range of commits

Checkout a local branch

  • SPC g s show Magit status view
  • b b checkout a branch
  • select or enter the branch name and hit ENTER

Checkout a commit

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on a commit
  • b b ENTER to checkout that commit

Checkout a different revision of a file

  • SPC g s show Magit status view
  • l l show log view
  • move point to the commit you want to checkout (using j and k)
  • O (capital letter O) f reset a file
  • hit ENTER to select the default revision selected above. (it will look something like master~4)
  • select a file
  • q to close the log view and see the file at the selected revision is staged

Create a local branch from a remote branch

  • SPC g s show Magit status view
  • b c create a branch
  • select or enter the remote branch and hit ENTER
  • hit ENTER to use the same name or enter a new name and hit ENTER

Pull from upstream

  • SPC g s show Magit status view
  • F u pull from upstream

Push to upstream

  • SPC g s show Magit status view
  • P u push to upstream

Stage files and commit

  • SPC g s show Magit status view
  • use j and k to position the cursor on a file
  • TAB to show and hide the diff for the file
  • s to stage a file (u to unstage a file and x to discard changes to a file)
  • c c to commit
  • write a commit message and save with SPC f s
  • , c to finish the commit message

Stage specific hunks 2

  • SPC g s show Magit status view
  • M-n / M-p to move to the "Unstaged changes" section
  • j / k to move to the desired file
  • TAB to expand the hunks in the file
  • M-n / M-p to move to different hunks
  • s / u to stage or unstage hunks
  • x to discard a hunk
  • c c to commit
  • Enter a commit message and save with SPC f s
  • , c to finish the commit

Merge master into the current branch

  • SPC g s show Magit status view
  • m m merge
  • select or enter master and hit ENTER

Rebase the current branch onto master

  • SPC g s show Magit status view
  • r e rebase
  • select or enter master and hit ENTER

Use interactive rebase to squash commits

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on a commit
  • r i to start the interactive rebase
  • use j and k to position the cursor on a commit to squash
  • s to mark the commit as to be squashed. (use s multiple times to squash multiple commits.)
  • , c to make it happen
  • edit the new squashed commit message and save with SPC f s
  • , c to finish

Use interactive rebase to reorder commits

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on a commit
  • r i to start the interactive rebase
  • use j and k to position the cursor on a commit to reorder
  • use M-k or M-j to move the commit up or down
  • , c to make it happen

Revert a commit

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on the commit you want to revert
  • _ O (capital letter O) to revert the commit
  • edit the commit message and save with SPC f s
  • , c to finish

(Soft) reset the last commit

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor one commit before the last one
  • O (capital letter O) s to soft reset
  • the selected commit should be e.g. master~1. Hit ENTER

Stash changes

  • SPC g s show Magit status view
  • z z stash changes
  • enter stash message and hit ENTER

Pop stash

  • SPC g s show Magit status view
  • z p pop from stash
  • select the stash to pop and hit ENTER

Copy git commit SHA 3

  • SPC g s show Magit status view
  • l l show log view
  • use j and k to position the cursor on a commit
  • y s copy the git commit SHA

Copy text from a Magit buffer 4

  • SPC g s show Magit status view
  • \ switch to text mode
  • copy text using normal vim keystrokes
  • \ switch back to Magit mode

Run a shell command 5

  • SPC g s show Magit status view
  • ! s run a shell command
  • enter a command to run and hit ENTER

List all branches 6

  • SPC g s show Magit status view
  • y r show refs

Jump to the next/prev section in the status view 7

  • SPC g s show Magit status view
  • g j jump to the next section
  • g k jump to the previous section

References

  1. https://twitter.com/iLemming/status/1058507342830923776
  2. https://github.com/emacs-evil/evil-magit
  3. https://twitter.com/a_simpson/status/749316494224265216
  4. https://twitter.com/iLemming/status/986074309234802688
  5. https://twitter.com/_wilfredh/status/689955624080248833
  6. https://emacs.stackexchange.com/a/27148
  7. https://www.youtube.com/watch?v=j-k-lkilbEs

Caching a filtered list of results w/ Redux, React Router, and redux-promise-memo

This post shows how to cache API data for a React + Redux application using ideas from my library, redux-promise-memo. The example app displays a filtered list of vehicles, a sidebar with make and model filters, and a detail page for each vehicle. Caching is used to prevent re-fetching API data when navigating between the detail pages and the list page.

In addition to React and Redux, this example uses React Router and redux-promise-middleware, though alternatives like redux-pack or Gluestick's promise middleware can also be used.

The example is broken into 3 sections: 1. Basic features (no caching), 2. Caching (manual setup), and 3. Caching with redux-promise-memo.

Basic features
  • filtering vehicles by make and model is done by the backend vehicles API
  • when a make is selected, the models API populates the models filter for the selected make
  • filter parameters (make and model) are stored in the URL query string to support deep linking to a page of filtered results and to support browser "back" and "forward" navigation
  • each vehicle detail page also has a unique route using the vehicle id in the route
Highlighted feature - caching
  • API responses are not re-fetched when moving back and forward between pages

Code for basic features (no caching)

The full example code is here on github. A demo is deployed here.

VehicleFilters.js
class VehiclesFilters extends React.Component {
  componentDidMount() {
    this._fetchData();
  }

  componentDidUpdate(prevProps) {
    if (this.props.query !== prevProps.query) {
      this._fetchData();
    }
  }

  _fetchData() {
    let { dispatch, query } = this.props;
    dispatch(actions.fetchModels(query.make));
  }

  render() {
    let { changeQuery, models, query } = this.props;
    return (
      <Container>
        <Select
          label="Make"
          onChange={e =>
            changeQuery({ make: e.currentTarget.value, model: "" })
          }
          options={["All makes", "Acura", "BMW", "Cadillac", "..."]}
          value={query.make || ""}
        />
        {query.make && (
          <Select
            label="Model"
            onChange={e => changeQuery({ model: e.currentTarget.value })}
            options={["All models", ...models]}
            value={query.model || ""}
          />
        )}
      </Container>
    );
  }
}

export default compose(
  withVehiclesRouter,
  connect(state => ({ models: state.models }))
)(VehiclesFilters);

The VehiclesFilters component has <select> inputs for the "Make" and "Model" filters.

  1. when a user changes the make to "BMW", the changeQuery function adds the ?make=BMW query string to the URL
  2. when the query string is updated, componentDidUpdate calls fetchModels which calls the models API
  3. when the API responds, the models list is stored in Redux at state.models
  4. when the Redux state changes, the "Model" <select> is updated with the new list of models

Notes:

  • withVehiclesRouter is a higher-order component that adds the following props to VehiclesFilters:

    • query - the parsed query string
    • changeQuery - a function used to update the query string

    See the implementation here

  • the route is the single source of truth for the make and model parameters. They are stored only in the route and not in Redux.

VehiclesList.js
class VehiclesList extends React.Component {
  componentDidMount() {
    this._fetchData();
  }

  componentDidUpdate(prevProps) {
    if (this.props.query !== prevProps.query) {
      this._fetchData();
    }
  }

  _fetchData() {
    let { dispatch, query } = this.props;
    dispatch(actions.fetchVehicles(query));
  }

  render() {
    let { isLoading, vehicles } = this.props;
    return (
      <Container>
        {isLoading ? (
          <Spinner />
        ) : (
          vehicles.map(vehicle => (
            <Link key={vehicle.id} to={`/vehicles/${vehicle.id}`}>
              <VehicleCard {...vehicle} />
            </Link>
          ))
        )}
      </Container>
    );
  }
}

export default compose(
  withVehiclesRouter,
  connect(state => ({
    isLoading: state.isLoading,
    vehicles: state.vehicles
  }))
)(VehiclesList);

The VehiclesList component gets data the same way the "Model" <select> input does.

  1. the previous VehiclesFilters component updates the route query string with a make or model
  2. when the route query string is updated, this component's componentDidUpdate calls fetchVehicles which calls the vehicles API
  3. when the API responds, the vehicle list is stored in Redux at state.vehicles
  4. when the Redux state changes, this component is updated with the new list of vehicles.

Each vehicle card is wrapped with a react-router <Link>. Clicking the vehicle card navigates to a new route, /vehicles/{vehicleId}.

VehicleDetail.js
class VehicleDetail extends React.Component {
  componentDidMount() {
    let { dispatch, vehicleId } = this.props;
    dispatch(actions.fetchVehicle(vehicleId));
  }

  render() {
    let { isLoading, vehicle } = this.props;
    return isLoading ? <Spinner /> : <VehicleCard {...vehicle} />;
  }
}

export default compose(
  withVehiclesRouter,
  connect(state => ({
    isLoading: state.isLoading,
    vehicle: state.vehicle
  }))
)(VehicleDetail);

VehicleDetail gets data in the same way as the "Model" filter and VehiclesList. One difference is that it doesn't need to use componentDidUpdate because the API input parameter (vehicleId) never changes.

  1. to display a vehicle detail page, a vehicle <Link> is clicked in the previous VehiclesList component which changes the route to /vehicles/{vehicleId}
  2. when the route changes, this component is rendered and passed the vehicleId prop.
  3. when this component is rendered, componentDidMount calls fetchVehicle which calls the vehicle detail API

The withVehiclesRouter higher-order component takes match.params.vehicleId from react-router and passes it to VehicleDetail as vehicleId.

App.js
let store = createStore(reducer, applyMiddleware(promiseMiddleware()));

let VehiclesPage = () => (
  <React.Fragment>
    <VehiclesFilters />
    <VehiclesList />
  </React.Fragment>
);

let App = () => (
  <Provider store={store}>
    <BrowserRouter>
      <Switch>
        <Route component={VehicleDetail} path="/vehicles/:vehicleId" />
        <Route component={VehiclesPage} path="/vehicles" />
      </Switch>
    </BrowserRouter>
  </Provider>
);

This shows react-router route configuration for the app and also the addition of redux-promise-middleware.

actions.js
export let fetchModels = make => ({
  type: "FETCH_MODELS",
  payload: fakeModelsApi(make)
});

export let fetchVehicle = vehicleId => ({
  type: "FETCH_VEHICLE",
  payload: fakeVehicleApi(vehicleId)
});

export let fetchVehicles = params => ({
  type: "FETCH_VEHICLES",
  payload: fakeVehiclesApi(params)
});
reducers.js
let isLoading = (state = false, action) => {
  switch (action.type) {
    case "FETCH_VEHICLE_PENDING":
    case "FETCH_VEHICLES_PENDING":
      return true;
    case "FETCH_VEHICLE_FULFILLED":
    case "FETCH_VEHICLES_FULFILLED":
      return false;
    default:
      return state;
  }
};

let models = (state = [], action) => {
  switch (action.type) {
    case "FETCH_MODELS_FULFILLED":
      return action.payload;
    default:
      return state;
  }
};

let vehicle = (state = [], action) => {
  switch (action.type) {
    case "FETCH_VEHICLE_FULFILLED":
      return action.payload;
    default:
      return state;
  }
};

let vehicles = (state = [], action) => {
  switch (action.type) {
    case "FETCH_VEHICLES_FULFILLED":
      return action.payload;
    default:
      return state;
  }
};

export default combineReducers({
  isLoading,
  models,
  vehicle,
  vehicles
});

These are the actions and reducers that are using redux-promise-middleware.

  • fakeModelsApi, fakeVehicleApi, and fakeVehiclesApi are meant to mimic an HTTP client like fetch or axios. They return a promise that resolves with some canned data after a 1 second delay.
  • the models, vehicle, and vehicles reducers store the API responses in the Redux state

Caching (manual setup)

In the above setup, API calls are made unnecessarily when navigating back and forth between vehicle detail pages and the main vehicles list. The goal is to eliminate the unnecessary calls.

The vehicle data is already "cached" in Redux. But the app needs to be smarter about when to fetch new data. In some apps, a component can check whether API data exists in Redux and skip the API call if it is present. In this case, however, doing so would not allow updating the results when the make or model filters change.

The approach I took in redux-promise-memo was to fetch only when API input parameters changed:

  • the API input parameters (e.g. make, model) are stored in Redux
  • when deciding whether to make a new API call, the current API input parameters are tested to see if they match the parameters previously stored in Redux
  • if they match, the API call is skipped, and the data already stored in Redux is used
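The decision described by these bullets can be sketched in a few lines of plain JavaScript (createCachedFetcher is a hypothetical helper for illustration, not part of the example app):

```javascript
// Dispatch the fetch only when the API input params differ from the params
// used by the last call; otherwise reuse the data already in Redux.
function createCachedFetcher(fetchFn) {
  let lastParams = null; // in the real app this lives in Redux
  return function maybeFetch(params) {
    if (JSON.stringify(params) === JSON.stringify(lastParams)) {
      return { fetched: false }; // cache hit: skip the API call
    }
    lastParams = params;
    fetchFn(params); // cache miss: make the API call
    return { fetched: true };
  };
}

const calls = [];
const maybeFetchVehicles = createCachedFetcher(params => calls.push(params));

maybeFetchVehicles({ make: "BMW" }); // fetches
maybeFetchVehicles({ make: "BMW" }); // skipped: same params
maybeFetchVehicles({ make: "BMW", model: "i3" }); // fetches again
console.log(calls.length); // 2
```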

Below are changes that can be made to implement this idea without using the library. The solution with the redux-promise-memo library is shown at the end.

The full example code is here on github. A demo is deployed here.

actions.js
export let fetchVehicles = params => ({
  type: "FETCH_VEHICLES",
  payload: fakeVehiclesApi(params),
  meta: { params } // <= NEW: add the API params to the action
});
reducers.js
// NEW: add this vehiclesCacheParams reducer
let vehiclesCacheParams = (state = null, action) => {
  switch (action.type) {
    case "FETCH_VEHICLES_FULFILLED":
      return action.meta.params;
    default:
      return state;
  }
};

The actions and reducers are updated to store the API parameters.

  • in the fetchVehicles action creator, the API params are added to the action
  • a new reducer, vehiclesCacheParams, is added which stores those params in Redux when the API succeeds
VehiclesList.js
class VehiclesList extends React.Component {
  _fetchData() {
    let { cacheParams, dispatch, query } = this.props;

    // NEW: add this "if" statement to check if API params have changed
    if (JSON.stringify(query) !== JSON.stringify(cacheParams)) {
      dispatch(actions.fetchVehicles(query));
    }
  }

  // ...the rest is the same as before
}

export default compose(
  withVehiclesRouter,
  connect(state => ({
    cacheParams: state.vehiclesCacheParams, // <= NEW line here
    isLoading: state.isLoading,
    vehicles: state.vehicles
  }))
)(VehiclesList);
  • in VehiclesList, the vehiclesCacheParams state is added to the connect call
  • then an "if" condition is added around the fetchVehicles action dispatch. JSON.stringify is used to compare arguments that are objects or arrays.
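One caveat with the JSON.stringify comparison: it is sensitive to key order, so the params objects must be built consistently (or the keys sorted first):

```javascript
const a = { make: "BMW", model: "i3" };
const b = { model: "i3", make: "BMW" }; // same contents, different key order
console.log(JSON.stringify(a) === JSON.stringify(b)); // false

// Passing a sorted key list as the replacer makes the output order-independent.
const stable = obj => JSON.stringify(obj, Object.keys(obj).sort());
console.log(stable(a) === stable(b)); // true
```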

Caching using redux-promise-memo

To avoid adding this boilerplate for every API, I abstracted the above idea into the redux-promise-memo library.

  • it uses a reducer to store the API input parameters like the manual solution
  • it provides a memoize function to wrap promise-based action creators (like those created with redux-promise-middleware above)
  • the memoize wrapper checks if the API input arguments have changed. If the input arguments match what is stored in Redux, the API call is skipped.
  • it also stores the "loading" and "success" state of the API. It skips the API call if the previous call is loading or successful, but re-fetches if there was an error.
  • it has an option to support multiple caches per API. If data is stored in a different place in Redux per set of input arguments, this option can be used.
  • it provides support for "invalidating" the cache using any Redux actions.
  • it provides support for other libraries besides redux-promise-middleware. Custom matchers for other libraries can be written. Examples of matchers are here
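To make the behavior concrete, here is a conceptual sketch of what a memoized action creator could look like. This is hypothetical illustration code, not redux-promise-memo's actual implementation; the real library also tracks loading/success/error state and supports invalidation.

```javascript
// Hypothetical sketch of the memoize idea (NOT the library's actual code):
// wrap an action creator in a thunk that compares the new arguments
// against those recorded under state._memo[key] and skips the dispatch
// when they match.
function memoizeSketch(actionCreator, key) {
  return (...args) => (dispatch, getState) => {
    const memo = getState()._memo || {};
    const prev = memo[key];
    if (prev && JSON.stringify(prev.args) === JSON.stringify(args)) {
      return null; // cache hit: skip the API call
    }
    // a real implementation would also record loading/success/error state here
    return dispatch({ ...actionCreator(...args), meta: { memoKey: key, args } });
  };
}

// exercise the sketch with a minimal store stub:
const dispatched = [];
const state = { _memo: { FETCH_VEHICLES: { args: [{ make: "Honda" }] } } };
const dispatch = action => { dispatched.push(action); return action; };
const getState = () => state;

const fetchVehicles = params => ({ type: "FETCH_VEHICLES", payload: params });
const memoized = memoizeSketch(fetchVehicles, "FETCH_VEHICLES");

memoized({ make: "Honda" })(dispatch, getState);  // matches cached args: skipped
memoized({ make: "Toyota" })(dispatch, getState); // new args: dispatched
console.log(dispatched.length); // 1
```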

The full example code is here on github. A demo is deployed here.

Install redux-promise-memo

redux-promise-memo uses redux-thunk so it needs to be installed as well.

npm install redux-promise-memo redux-thunk
index.js
import thunk from "redux-thunk";

let store = createStore(reducer, applyMiddleware(thunk, promiseMiddleware()));

Add redux-thunk to redux middleware

reducers.js
// NEW: remove the vehiclesCacheParams reducer

// NEW: add the _memo reducer
import {
  createMemoReducer,
  reduxPromiseMiddlewareConfig
} from "redux-promise-memo";

let _memo = createMemoReducer(reduxPromiseMiddlewareConfig);

let rootReducer = combineReducers({
  _memo,
  isLoading,
  models,
  vehicle,
  vehicles
});
  • create the redux-promise-memo reducer and add it to the root reducer. IMPORTANT: the Redux state slice must be named _memo for it to work. I tried to create a Redux enhancer to handle this automatically, but did not get it to work reliably with other libraries.
  • remove the *CacheParams reducers from the manual solution that stored the API input params
actions.js
import { memoize } from "redux-promise-memo";

let _fetchModels = make => ({
  type: "FETCH_MODELS",
  payload: fakeModelsApi(make)
  // NEW: remove the API params from the action
});
// NEW: wrap the action creator with `memoize`
export let memoizedFetchModels = memoize(_fetchModels, "FETCH_MODELS");

let _fetchVehicle = vehicleId => ({
  type: "FETCH_VEHICLE",
  payload: fakeVehicleApi(vehicleId)
});
export let memoizedFetchVehicle = memoize(_fetchVehicle, "FETCH_VEHICLE");

let _fetchVehicles = params => ({
  type: "FETCH_VEHICLES",
  payload: fakeVehiclesApi(params)
});
export let memoizedFetchVehicles = memoize(_fetchVehicles, "FETCH_VEHICLES");
  • wrap the action creators with the memoize higher order function
  • specify a "key" used to separate parameters in the reducer. Using the action type as the key is recommended.
VehiclesList.js
class VehiclesList extends React.Component {
  componentDidUpdate(prevProps) {
    // NEW: removed "if" condition here
    this._fetchData();
  }

  _fetchData() {
    let { dispatch, query } = this.props;
    // NEW: removed "if" condition here
    dispatch(actions.memoizedFetchVehicles(query));
  }

  // ...the rest is the same as before
}

export default compose(
  withVehiclesRouter,
  connect(state => ({
    // NEW: removed the cacheParams line here
    isLoading: state.isLoading,
    vehicles: state.vehicles
  }))
)(VehiclesList);
  • update components to remove "if" conditions because the library does the check
  • remove the use of state.vehiclesCacheParams

This solution should behave similarly to the manual solution with less boilerplate code. In the development environment only, it also logs console messages showing if the API is requesting, loading, or cached.

What does Redux's combineReducers do?

Redux uses a single root reducer function that accepts the current state (and an action) as input and returns a new state. There are many ways to write the root reducer, but a common recommended practice is to break the state object into slices and use a separate sub reducer to operate on each slice of the state. Usually Redux's helper utility, combineReducers, is used to do this. combineReducers is a nice shortcut because it encourages the good practice of reducer composition, but the abstraction can obscure the underlying simplicity of Redux reducers.

The example below shows how a root reducer could be written without combineReducers:

Given a couple of reducers:

function apples(state, action) {
  // do stuff
  return state;
}

function bananas(state, action) {
  // do stuff
  return state;
}

This reducer created with combineReducers:

const rootReducer = combineReducers({ apples, bananas });

is equivalent to this reducer:

function rootReducer(state = {}, action) {
  return {
    apples: apples(state.apples, action),
    bananas: bananas(state.bananas, action),
  };
}

Usage without ES6 concise properties

The above example used ES6 concise properties, but combineReducers can also be used without concise properties. This reducer created with combineReducers:

const rootReducer = combineReducers({
  a: apples,
  b: bananas
});

is equivalent to this reducer:

function rootReducer(state = {}, action) {
  return {
    a: apples(state.a, action),
    b: bananas(state.b, action),
  };
}

Understanding how combineReducers works can be helpful in learning other ways reducers can be used.
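The equivalence can also be seen by writing a minimal combineReducers by hand. This is a sketch of the core idea only; Redux's real implementation additionally validates reducer shapes and warns about undefined state.

```javascript
// A minimal hand-rolled combineReducers: call each sub reducer with its
// own slice of state and reassemble the slices into a new state object.
function combineReducers(reducers) {
  return (state = {}, action) =>
    Object.keys(reducers).reduce((nextState, key) => {
      nextState[key] = reducers[key](state[key], action);
      return nextState;
    }, {});
}

// simple counting versions of the apples/bananas reducers from above:
const apples = (state = 0, action) =>
  action.type === "ADD_APPLE" ? state + 1 : state;
const bananas = (state = 0, action) =>
  action.type === "ADD_BANANA" ? state + 1 : state;

const rootReducer = combineReducers({ apples, bananas });

let state = rootReducer(undefined, { type: "@@INIT" });
state = rootReducer(state, { type: "ADD_APPLE" });
console.log(state); // { apples: 1, bananas: 0 }
```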

References / see also

How to set up a React Apollo client with a Graphcool GraphQL server

Graphcool is a service similar to Firebase except it is used to create GraphQL APIs instead of RESTful ones. Apollo Client is a GraphQL client (alternative to Relay) that can be used with React (and other frontend frameworks). Below is how to create a Graphcool GraphQL service and query it from a React frontend using Apollo Client. I also used Next.js to set up the React project. I am running Node 8.4.0 on macOS Sierra. The code for this example is on github: https://github.com/saltycrane/graphcool-apollo-example.

Jump to: Graphcool setup, Next.js setup, or Apollo setup.

Graphcool GraphQL server

Graphcool Apollo Quickstart: https://www.graph.cool/docs/quickstart/frontend/react/apollo-tijghei9go

  • Install the Graphcool command-line tool (step 2 in quickstart)
    $ npm install -g graphcool-framework 
    
    $ graphcool-framework --version
    graphcool-framework/0.11.5 (darwin-x64) node-v8.4.0 
    
  • Create a Graphcool service (step 3 in quickstart)
    $ cd /tmp
    $ mkdir graphcool-apollo-example
    $ cd graphcool-apollo-example 
    
    $ graphcool-framework init myserver
    Creating a new Graphcool service in myserver... ✔
    
    Written files:
    ├─ types.graphql
    ├─ src
    │  ├─ hello.js
    │  └─ hello.graphql
    ├─ graphcool.yml
    └─ package.json
    
    To get started, cd into the new directory:
      cd myserver
    
    To deploy your Graphcool service:
      graphcool deploy
    
    To start your local Graphcool cluster:
      graphcool local up
    
    To add facebook authentication to your service:
      graphcool add-template auth/facebook
    
    You can find further instructions in the graphcool.yml file,
    which is the central project configuration.
    
  • Add a Post type definition. (step 4 in quickstart) Edit myserver/types.graphql:
    type Post @model {
      id: ID! @isUnique    # read-only (managed by Graphcool)
      createdAt: DateTime! # read-only (managed by Graphcool)
      updatedAt: DateTime! # read-only (managed by Graphcool)
    
      description: String!
      imageUrl: String!
    }
  • Deploy the Graphcool server (step 5 in quickstart). After running the deploy command, it will ask to select a cluster, select a name, and create an account with Graphcool.
    $ cd myserver
    $ graphcool-framework deploy
    ? Please choose the cluster you want to deploy to
    
    shared-eu-west-1
    
    Auth URL: https://console.graph.cool/cli/auth?cliToken=xxxxxxxxxxxxxxxxxxxxxxxxx&authTrigger=auth
    Authenticating... ✔
    
    Creating service myserver in cluster shared-eu-west-1... ✔
    Bundling functions... 2.3s
    Deploying... 1.3s
    
    Success! Created the following service:
    
    Types
    
      Post
       + A new type with the name `Post` is created.
       ├─ +  A new field with the name `createdAt` and type `DateTime!` is created.
       ├─ +  A new field with the name `updatedAt` and type `DateTime!` is created.
       ├─ +  A new field with the name `description` and type `String!` is created.
       └─ +  A new field with the name `imageUrl` and type `String!` is created.
    
    Resolver Functions
    
      hello
       + A new resolver function with the name `hello` is created.
    
    Permissions
    
      Wildcard Permission
       ? The wildcard permission for all operations is added.
    
    Here are your GraphQL Endpoints:
    
      Simple API:        https://api.graph.cool/simple/v1/cjc2uk4kx0vzo01603rkov391
      Relay API:         https://api.graph.cool/relay/v1/cjc2uk4kx0vzo01603rkov391
      Subscriptions API: wss://subscriptions.graph.cool/v1/cjc2uk4kx0vzo01603rkov391 
    
  • Run some queries in the Graphcool playground (step 6 in quickstart). Run the following command to open a new browser tab with the Graphcool playground:
    $ graphcool-framework playground 
    
    Run a query to create a post:
    mutation {
      createPost(
        description: "A rare look into the Graphcool office"
        imageUrl: "https://media2.giphy.com/media/xGWD6oKGmkp6E/200_s.gif"
      ) {
        id
      }
    }
    Run a query to retrieve all posts:
    query {
      allPosts {
        id
        description
        imageUrl
      }
    }
  • Done with Graphcool server. Also see https://console.graph.cool/server/schema/types

Next.js React frontend

Set up a React frontend using the Next.js framework. Next.js setup: https://github.com/zeit/next.js#setup

  • Create a myclient directory alongside the myserver directory:
    $ cd /tmp/graphcool-apollo-example
    $ mkdir myclient
    $ cd myclient 
    
  • Create myclient/package.json:
    {
      "dependencies": {
        "next": "4.2.1",
        "react": "16.2.0",
        "react-dom": "16.2.0"
      },
      "scripts": {
        "dev": "next",
        "build": "next build",
        "start": "next start"
      }
    }
  • Install Next.js and React:
    $ npm install
    npm WARN deprecated npmconf@2.1.2: this package has been reintegrated into npm and is now out of date with respect to npm
    npm WARN deprecated @semantic-release/last-release-npm@2.0.2: Use @semantic-release/npm instead
    
    > fsevents@1.1.3 install /private/tmp/graphcool-apollo-example/myclient/node_modules/fsevents
    > node install
    
    [fsevents] Success: "/private/tmp/graphcool-apollo-example/myclient/node_modules/fsevents/lib/binding/Release/node-v57-darwin-x64/fse.node" already installed
    Pass --update-binary to reinstall or --build-from-source to recompile
    
    > uglifyjs-webpack-plugin@0.4.6 postinstall /private/tmp/graphcool-apollo-example/myclient/node_modules/uglifyjs-webpack-plugin
    > node lib/post_install.js
    
    npm notice created a lockfile as package-lock.json. You should commit this file.
    npm WARN myclient No description
    npm WARN myclient No repository field.
    npm WARN myclient No license field.
    
    + react-dom@16.2.0
    + react@16.2.0
    + next@4.2.1
    added 838 packages in 20.012s
    
  • Create a Hello World page
    $ mkdir pages
    
    Create myclient/pages/index.js:
    const Home = () => <div>Hello World</div>;
    
    export default Home;
    
  • Run the Next.js dev server
    $ npm run dev 
    
    Go to http://localhost:3000 in the browser

Apollo Client

Set up Apollo Client to query the Graphcool server. Apollo Client setup: https://www.apollographql.com/docs/react/basics/setup.html

  • Install Apollo Client
    $ cd /tmp/graphcool-apollo-example/myclient 
    
    $ npm install apollo-client-preset react-apollo graphql-tag graphql 
    npm WARN apollo-link-http@1.3.2 requires a peer of graphql@^0.11.0 but none is installed. You must install peer dependencies yourself.
    npm WARN myclient No description
    npm WARN myclient No repository field.
    npm WARN myclient No license field.
    
    + react-apollo@2.0.4
    + graphql-tag@2.6.1
    + apollo-client-preset@1.0.6
    + graphql@0.12.3
    added 20 packages in 5.476s
    
  • Set up Apollo Client. Edit myclient/pages/index.js:
    import { InMemoryCache } from "apollo-cache-inmemory";
    import { ApolloClient } from "apollo-client";
    import { HttpLink } from "apollo-link-http";
    import { ApolloProvider } from "react-apollo";
    
    const client = new ApolloClient({
      link: new HttpLink({
        // Replace this with your Graphcool server URL
        uri: "https://api.graph.cool/simple/v1/cjc2uk4kx0vzo01603rkov391",
      }),
      cache: new InMemoryCache(),
    });
    
    const Home = () => <div>Hello World</div>;
    
    const App = () => (
      <ApolloProvider client={client}>
        <Home />
      </ApolloProvider>
    );
    
    export default App;
    
  • Try running the dev server
    $ npm run dev 
    
    Go to http://localhost:3000 in the browser
  • But, get this error:
    Error: fetch is not found globally and no fetcher passed, to fix pass a fetch for
          your environment like https://www.npmjs.com/package/node-fetch.
    
          For example:
            import fetch from 'node-fetch';
            import { createHttpLink } from 'apollo-link-http';
    
            const link = createHttpLink({ uri: '/graphql', fetch: fetch });
    
        at warnIfNoFetch (/private/tmp/graphcool-apollo-example/myclient/node_modules/apollo-link-http/lib/httpLink.js:72:15)
        at createHttpLink (/private/tmp/graphcool-apollo-example/myclient/node_modules/apollo-link-http/lib/httpLink.js:89:5)
        at new HttpLink (/private/tmp/graphcool-apollo-example/myclient/node_modules/apollo-link-http/lib/httpLink.js:159:34)
        at Object.<anonymous> (/private/tmp/graphcool-apollo-example/myclient/.next/dist/pages/index.js:25:9)
          at Module._compile (module.js:573:30)
          at Module._compile (/private/tmp/graphcool-apollo-example/myclient/node_modules/source-map-support/source-map-support.js:492:25)
          at Object.Module._extensions..js (module.js:584:10)
          at Module.load (module.js:507:32)
          at tryModuleLoad (module.js:470:12)
          at Function.Module._load (module.js:462:3)
  • Install node-fetch. This is needed because Next.js runs on a Node server in addition to the browser.
    $ npm install node-fetch
    npm WARN apollo-link-http@1.3.2 requires a peer of graphql@^0.11.0 but none is installed. You must install peer dependencies yourself.
    npm WARN myclient No description
    npm WARN myclient No repository field.
    npm WARN myclient No license field.
    
    + node-fetch@1.7.3
    updated 1 package in 4.028s
    
  • Update the code to use node-fetch as described in the error message:
    import { InMemoryCache } from "apollo-cache-inmemory";
    import { ApolloClient } from "apollo-client";
    import { createHttpLink } from "apollo-link-http";
    import gql from "graphql-tag";
    import fetch from "node-fetch";
    import { ApolloProvider } from "react-apollo";
    
    const client = new ApolloClient({
      link: createHttpLink({
        // Replace this with your Graphcool server URL
        uri: "https://api.graph.cool/simple/v1/cjc2uk4kx0vzo01603rkov391",
        fetch: fetch,
      }),
      cache: new InMemoryCache(),
    });
    
    class Home extends React.Component {
      componentDidMount() {
        client
          .query({
            query: gql`
              {
                allPosts {
                  id
                  description
                  imageUrl
                }
              }
            `,
          })
          .then(console.log);
      }
    
      render() {
        return <div>Look in the devtools console</div>;
      }
    }
    
    const App = () => (
      <ApolloProvider client={client}>
        <Home />
      </ApolloProvider>
    );
    
    export default App;
    
  • Try running the dev server again
    $ npm run dev 
    
    Go to http://localhost:3000 in the browser
  • It works. Open the browser devtools console and see the result of the query:
    {
      "data": {
        "allPosts": [
          {
            "id": "cjbffxjq5rrvd0192qmptpm2f",
            "description": "A rare look into the Graphcool office",
            "imageUrl": "https://media2.giphy.com/media/xGWD6oKGmkp6E/200_s.gif",
            "__typename": "Post"
          }
        ]
      },
      "loading": false,
      "networkStatus": 7,
      "stale": false
    }
  • Use the graphql higher-order component to make things nicer. Edit myclient/pages/index.js:
    import { InMemoryCache } from "apollo-cache-inmemory";
    import { ApolloClient } from "apollo-client";
    import { createHttpLink } from "apollo-link-http";
    import gql from "graphql-tag";
    import fetch from "node-fetch";
    import { ApolloProvider, graphql } from "react-apollo";
    
    const client = new ApolloClient({
      link: createHttpLink({
        // Replace this with your Graphcool server URL
        uri: "https://api.graph.cool/simple/v1/cjc2uk4kx0vzo01603rkov391",
        fetch: fetch,
      }),
      cache: new InMemoryCache(),
    });
    
    const MY_QUERY = gql`
      {
        allPosts {
          id
          description
          imageUrl
        }
      }
    `;
    
    const Home = ({ data }) => <pre>{JSON.stringify(data, null, 2)}</pre>;
    
    const HomeWithData = graphql(MY_QUERY)(Home);
    
    const App = () => (
      <ApolloProvider client={client}>
        <HomeWithData />
      </ApolloProvider>
    );
    
    export default App;
    
  • Run the dev server
    $ npm run dev 
    
    Go to http://localhost:3000 in the browser and see this result on the page:
    {
      "variables": {},
      "loading": false,
      "networkStatus": 7,
      "allPosts": [
        {
          "id": "cjbffxjq5rrvd0192qmptpm2f",
          "description": "A rare look into the Graphcool office",
          "imageUrl": "https://media2.giphy.com/media/xGWD6oKGmkp6E/200_s.gif",
          "__typename": "Post"
        }
      ]
    }

Docker cheat sheet

An image is a read-only template with instructions for creating a Docker container.
A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.
A container is a runnable instance of an image. You can create, start, stop, move, or delete a container. A container is a process which runs on a host. ...the container process that runs is isolated in that it has its own file system, its own networking, and its own isolated process tree separate from the host.

Listing

Removing

Pulling images

Publishing images

See also: Get started with Docker - Share your image

Building images from Dockerfiles

Creating containers

Starting / stopping containers

Running containers

docker run is a combination of (optionally) docker pull, docker create, and docker start. See also Docker run reference.

volumes
ports & networking

Interacting with containers

Getting information

* docker-compose commands assume a docker-compose.yml file in the current directory.

See also


  1. Reference: Cloning Docker data volumes. See also: Docker volumes guide: Backup, restore, or migrate data volumes.

How to map Caps Lock to Escape when tapped and Control when held on Mac OS Sierra

Escape and Control are useful keys in Vim, so it's nice to map them to a more convenient key like Caps Lock. I had been using Karabiner to do this, but Karabiner doesn't work on macOS Sierra. Fortunately, Karabiner-Elements provides a subset of the features planned for the next-generation Karabiner, including remapping Caps Lock to Escape when tapped and Control when held down. The solution below is from @zeekay on issue #8. I am using Karabiner-Elements 0.91.12 and macOS Sierra 10.12.4.

  • Uninstall Seil and Karabiner, if previously installed
  • Install Karabiner-Elements
    $ brew cask install karabiner-elements 
    
  • Edit ~/.config/karabiner/karabiner.json to be:
    {
        "global": {
            "check_for_updates_on_startup": true,
            "show_in_menu_bar": true,
            "show_profile_name_in_menu_bar": false
        },
        "profiles": [
            {
                "complex_modifications": {
                    "parameters": {
                        "basic.to_if_alone_timeout_milliseconds": 250
                    },
                    "rules": [
                        {
                            "manipulators": [
                                {
                                    "description": "Change caps_lock to control when used as modifier, escape when used alone",
                                    "from": {
                                        "key_code": "caps_lock",
                                        "modifiers": {
                                            "optional": [
                                                "any"
                                            ]
                                        }
                                    },
                                    "to": [
                                        {
                                            "key_code": "left_control"
                                        }
                                    ],
                                    "to_if_alone": [
                                        {
                                            "key_code": "escape",
                                            "modifiers": {
                                                "optional": [
                                                    "any"
                                                ]
                                            }
                                        }
                                    ],
                                    "type": "basic"
                                }
                            ]
                        }
                    ]
                },
                "devices": [],
                "fn_function_keys": {
                    "f1": "display_brightness_decrement",
                    "f10": "mute",
                    "f11": "volume_decrement",
                    "f12": "volume_increment",
                    "f2": "display_brightness_increment",
                    "f3": "mission_control",
                    "f4": "launchpad",
                    "f5": "illumination_decrement",
                    "f6": "illumination_increment",
                    "f7": "rewind",
                    "f8": "play_or_pause",
                    "f9": "fastforward"
                },
                "name": "Default profile",
                "selected": true,
                "virtual_hid_keyboard": {
                    "caps_lock_delay_milliseconds": 0,
                    "keyboard_type": "ansi"
                }
            }
        ]
    }

Alternative

@kbussell created a script to do the same using Hammerspoon [1] instead of Karabiner.


  1. Hammerspoon is awesome.

postgres on Mac OS (homebrew) notes

Here are my notes on running PostgreSQL, installed with Homebrew, on macOS Mojave. I also wrote some notes on running postgres in Docker here.

install postgres

$ brew install postgresql

start postgres

$ brew services start postgresql

restart postgres

$ brew services restart postgresql

postgres version

$ postgres --version
postgres (PostgreSQL) 11.1

check postgres is running (option 1)

$ brew services list
Name       Status  User      Plist
postgresql started eliot /Users/eliot/Library/LaunchAgents/homebrew.mxcl.postgresql.plist

check postgres is running (option 2)

$ ps -ef | grep postgres
  502   629     1   0 10Jan19 ??         0:12.39 /usr/local/opt/postgresql/bin/postgres -D /usr/local/var/postgres
  502   715   629   0 10Jan19 ??         0:00.29 postgres: checkpointer
  502   716   629   0 10Jan19 ??         0:02.39 postgres: background writer
  502   717   629   0 10Jan19 ??         0:02.32 postgres: walwriter
  502   718   629   0 10Jan19 ??         0:09.99 postgres: autovacuum launcher
  502   719   629   0 10Jan19 ??         0:45.17 postgres: stats collector
  502   720   629   0 10Jan19 ??         0:00.22 postgres: logical replication launcher
  502 73227 19287   0  8:43PM ttys003    0:00.00 grep postgres

postgres log file

/usr/local/var/log/postgres.log

postgres data directory

/usr/local/var/postgres

psql commands

Start psql:

$ psql postgres

Help with psql commands:

postgres=# \?

List databases:

postgres=# \l

Connect to a database:

postgres=# \c mydatabase

List tables:

mydatabase=# \d