
Is there a reason why bulkUpdate doesn't exist? #632

Closed
mcfarljw opened this issue Jan 6, 2018 · 14 comments

Comments

@mcfarljw

mcfarljw commented Jan 6, 2018

I just want to update several records by key. Of course I can write my own helper that calls table.update inside Promise.all, but is there a specific reason this wasn't included?

@nponiros
Collaborator

nponiros commented Jan 6, 2018

Depending on how you update, you might be able to use http://dexie.org/docs/Table/Table.bulkPut(). http://dexie.org/docs/Table/Table.update() explains the difference between put and update.

If you use table.update, you should put the calls in a transaction and react when the transaction is done (not when the individual updates are done). If I remember correctly, bulkPut does just that internally with table.put.
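
For example, a minimal sketch of that pattern (assuming a db instance with a todos table and an updates array of { id, changes } entries):

// Sketch: run several table.update() calls inside one read-write transaction
// and react when the whole transaction resolves, not when each update resolves.
db.transaction('rw', db.todos, () => {
  for (const { id, changes } of updates) {
    db.todos.update(id, changes);
  }
}).then(() => {
  console.log('all updates committed');
});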

No clue why bulkUpdate is not implemented.

@mcfarljw
Author

mcfarljw commented Jan 7, 2018

Right, in this particular case I'm just updating a single property for a few specific keys and don't want to overwrite the other properties. Thanks for the heads-up about wrapping it all in a transaction!

@dfahlander
Collaborator

The only reason bulkUpdate isn't implemented has been lack of time ;) The bulk methods are, as @nponiros said, a transaction block with operations, but there is a significant performance gain in using the bulk methods, as they ignore success events from indexedDB, which makes a big difference when working with large arrays of objects.

The plan is to implement Table.bulkUpdate(), which will be more performant than several individual updates.
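
For illustration, a rough sketch of the two call patterns (assuming a friends table; the gain comes from the bulk call not waiting on each individual success event):

// Individual puts: one request and one success callback per object.
await db.transaction('rw', db.friends, async () => {
  for (const friend of friends) {
    await db.friends.put(friend);
  }
});

// Bulk put: a single call; intermediate success events are ignored internally.
await db.friends.bulkPut(friends);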

@mcfarljw
Author

mcfarljw commented Jan 8, 2018

Interesting, and thanks for clearing that up! In my case my bulk updates generally include fewer than 100 items at a time, so for now the performance cost of updating individually seems negligible.

@diegocr

diegocr commented May 12, 2020

Hi there, looking for some feedback... what do you guys think about this bulkUpdate implementation, i.e. am I missing something?

MyDexie.prototype = Object.create(Dexie.prototype);

// Note: promisify(), MyDexie.exists() and MyDexie.equal() are my own helpers (not part of Dexie).
MyDexie.prototype.bulkUpdate = promisify(function(resolve, reject, table, bulkData) {
    'use strict';

    // If no table name is given, default to this database's single table.
    if (typeof table !== 'string') {
        bulkData = table;
        table = this.tables;
        table = table.length === 1 && table[0].name;
    }

    if (!bulkData.length) {
        return resolve(bulkData);
    }
    table = this.table(table);

    var i;
    var keyPath;
    var anyOf = [];
    var schema = table.schema;
    var indexes = schema.indexes;

    // Prefer the first unique index as the lookup key; fall back to the primary key below.
    for (i = 0; i < indexes.length; ++i) {
        if (indexes[i].unique) {
            keyPath = indexes[i].keyPath;
            break;
        }
    }

    // De-duplicate bulkData by key while collecting the keys to look up.
    for (i = bulkData.length; i--;) {
        var v = bulkData[i][keyPath || schema.primKey.keyPath];

        if (MyDexie.exists(anyOf, v)) {
            bulkData.splice(i, 1);
        }
        else {
            anyOf.push(v);
        }
    }

    // Fetch the existing records, either via the unique index or by primary key.
    (keyPath ? table.where(keyPath).anyOf(anyOf).toArray() : table.bulkGet(anyOf))
        .then(function(r) {
            var toUpdate = [];

            keyPath = keyPath || schema.primKey.keyPath;
            for (var i = r.length; i--;) {
                for (var j = r[i] && bulkData.length; j--;) {
                    if (MyDexie.equal(r[i][keyPath], bulkData[j][keyPath])) {
                        delete bulkData[j][keyPath];
                        toUpdate.push([r[i], bulkData.splice(j, 1)[0]]);
                        break;
                    }
                }
            }

            // Apply each change set with Collection.modify() against the primary key;
            // anything left in bulkData was not found and gets inserted via bulkPut() below.
            var tasks = toUpdate.map(function(u) {
                return table.where(":id").equals(u[0][schema.primKey.keyPath]).modify(u[1]);
            });
            if (bulkData.length) {
                tasks.push(table.bulkPut(bulkData));
            }
            return Promise.all(tasks);
        })
        .then(resolve)
        .catch(reject);
});

Thanks in advance :)

@jonathanadams

Have there been any updates on this?

I am also looking for something similar to a bulkUpdate method, but one that bulkAdds as well as bulkUpdates. Similar to bulkPut, but instead of replacing the object it updates the values.

@dfahlander
Collaborator

The implementation of bulkUpdate should do things similarly to how Collection.modify() is implemented, but much simpler, as it will only apply a set of keyPaths and change them to their new values. Basically do dbCoreTable.getMany({trans, keys, cache: 'immutable'}).then(values => values.map((value, i) => applyUpdate(value, updates[i]))).then(newValues => dbCoreTable.put({trans, keys, values: newValues})). applyUpdate should be done the same way as the modifier function does it in Collection.modify() - go through the keyPaths and call setByKeyPath() on them. It must take care of both inbound and outbound tables.

I will implement this at some point or get a PR that does it.

@rowild

rowild commented Jan 3, 2022

Hi! Happy New Year! I was wondering whether this feature has been implemented already? Thanks for the info! @dfahlander

@dfahlander
Collaborator

It's not implemented.

@Toshinaki

So is there any sample code showing how to achieve this before bulkUpdate is implemented?

Is this code alright?

const updateMulti = async (data: Array<TodoType>) => {
  await db.transaction("rw", db.todos, async () => {
    await Promise.all(
      data.map(
        async ({ id, ...rest }) => await db.todos.update(id, { ...rest, updatedAt: Date.now() })
      )
    );
  });
  return data.map((d) => d.id);
};

@dfahlander
Collaborator

dfahlander commented Oct 3, 2022

So is there any sample code showing how to achieve this before bulkUpdate is implemented?

Is this code alright?

const updateMulti = async (data: Array<TodoType>) => {
  await db.transaction("rw", db.todos, async () => {
    await Promise.all(
      data.map(
        async ({ id, ...rest }) => await db.todos.update(id, { ...rest, updatedAt: Date.now() })
      )
    );
  });
  return data.map((d) => d.id);
};

No, this looks more like it tries to do what bulkPut() already does. bulkPut() is an upsert operation - it will add or replace the given object, while update() will only apply changes to individual properties.
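
To illustrate the difference, a small sketch (assuming a todos table containing { id: 1, title: 'Buy milk', done: false }):

// put() replaces the whole object, so the 'done' property is lost:
await db.todos.put({ id: 1, title: 'Buy oat milk' });

// update() only applies the given properties, so 'done' is kept:
await db.todos.update(1, { title: 'Buy oat milk' });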

There is an implementation of bulkUpdate() in dexie-cloud-addon (not exported though) that you could copy/paste:

import Dexie, { Table, cmp } from 'dexie';

export async function bulkUpdate(
  table: Table,
  keys: any[],
  changeSpecs: { [keyPath: string]: any }[]
) {
  // Read the current values so the change specs can be applied on top of them.
  const objs = await table.bulkGet(keys);
  const resultKeys: any[] = [];
  const resultObjs: any[] = [];
  keys.forEach((key, idx) => {
    const obj = objs[idx];
    if (obj) {
      for (const [keyPath, value] of Object.entries(changeSpecs[idx])) {
        if (keyPath === table.schema.primKey.keyPath) {
          if (cmp(value, key) !== 0) {
            throw new Error(`Cannot change primary key`);
          }
        } else {
          Dexie.setByKeyPath(obj, keyPath, value);
        }
      }
      resultKeys.push(key);
      resultObjs.push(obj);
    }
  });
  // Outbound primary keys (no keyPath in the schema) must be passed explicitly to bulkPut().
  await (table.schema.primKey.keyPath == null
    ? table.bulkPut(resultObjs, resultKeys)
    : table.bulkPut(resultObjs));
}

EDIT: You'd need dexie@^4.0.0-alpha.1 to import cmp. It's needed if your primary keys can be arrays, dates or typed arrays, but if you know your primary keys are numbers or strings, you could change it to if (value !== key) to get rid of the dependency on cmp.
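
Usage of the helper above would then look roughly like this (a sketch, assuming a todos table with numeric primary key id):

// One change spec per key, applied on top of the stored object.
await bulkUpdate(db.todos, [1, 2], [
  { done: true },
  { title: 'Renamed', updatedAt: Date.now() },
]);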

@dfahlander
Collaborator

bulkUpdate is now in Dexie 4.0 and in the docs: https://dexie.org/docs/Table/Table.bulkUpdate()
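
A quick usage sketch based on that docs page (assuming a todos table; each entry pairs a primary key with the changes to apply):

await db.todos.bulkUpdate([
  { key: 1, changes: { done: true } },
  { key: 2, changes: { title: 'Renamed' } },
]);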

@rowild

rowild commented Jan 30, 2023

@dfahlander Great, thank you! I'll try it ASAP!

@tinh1115

I can confirm bulkUpdate is really fast. In my case, updating 8k records via bulkUpdate cut the time in half. Thank you so much.
