JavaScript DataTransfer items not persisting through async calls

It seems the DataTransfer context goes missing over time. My solution is to copy the required data before it is lost, and reuse the copy when needed:

const files = [...e.dataTransfer.items].map(item => item.getAsFile());

Here is the code from @Brad's jsfiddle, modified with my solution:

const dropZone = document.querySelector(".dropZone");
const sendFile = file => {
  const formData = new FormData();
  // Append the File itself; iterating it with for...in would only pick up
  // prototype properties and methods, never the file's contents
  formData.append("file", file, file.name);
  /**
   * https://docs.postman-echo.com/ - postman mock server
   * https://cors-anywhere.herokuapp.com/ - CORS proxy server
   **/
  return fetch(
    "https://cors-anywhere.herokuapp.com/https://postman-echo.com/post",
    {
      method: "POST",
      body: formData
    }
  );
};

dropZone.addEventListener("dragover", e => {
  e.preventDefault();
});

dropZone.addEventListener("drop", async e => {
  e.preventDefault();
  // Grab the File objects synchronously, before the first await
  const files = [...e.dataTransfer.items].map(item => item.getAsFile());
  const responses = [];

  for (const file of files) {
    const res = await sendFile(file);
    responses.push(res);
  }
  console.log(responses);
});
body {
  font-family: sans-serif;
}

.dropZone {
  display: inline-flex;
  background: #3498db;
  color: #ecf0f1;
  border: 0.3em dashed #ecf0f1;
  border-radius: 0.3em;
  padding: 5em;
  font-size: 1.2em;
}
<div class="dropZone">
  Drop Zone
</div>

I ran into this problem, and was looking to persist the entire DataTransfer object, not just the items or types, because my asynchronous code's API consumes the DataTransfer type itself. What I ended up doing is creating a new DataTransfer(), and effectively copying over the original's properties (except the drag image).

Here's the gist (in TypeScript): https://gist.github.com/mitchellirvin/261d82bbf09d5fdee41715fa2622d4a6

// https://developer.mozilla.org/en-US/docs/Web/API/DataTransferItem/kind
enum DataTransferItemKind {
  FILE = "file",
  STRING = "string",
}

/**
 * Returns a properly deep-cloned object of type DataTransfer. This is necessary because dataTransfer items are lost
 * in asynchronous calls. See https://stackoverflow.com/questions/55658851/javascript-datatransfer-items-not-persisting-through-async-calls
 * for more details.
 * 
 * @param original the DataTransfer to deep clone
 */
export function cloneDataTransfer(original: DataTransfer): DataTransfer {
  const cloned = new DataTransfer();
  cloned.dropEffect = original.dropEffect;
  cloned.effectAllowed = original.effectAllowed;

  // Snapshot the item list up front; DataTransferItemList is live
  for (const originalItem of Array.from(original.items)) {
    switch (originalItem.kind) {
      case DataTransferItemKind.FILE: {
        const file = originalItem.getAsFile();
        if (file != null) {
          cloned.items.add(file);
        }
        break;
      }
      case DataTransferItemKind.STRING:
        cloned.setData(originalItem.type, original.getData(originalItem.type));
        break;
      default:
        console.error("Unrecognized DataTransferItem.kind: ", originalItem.kind);
        break;
    }
  }
  return cloned;
}

You can consume it like so, and then use the clone in the same way you originally planned to use evt.dataTransfer:

const clone = cloneDataTransfer(evt.dataTransfer);
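
For example, in a drop handler the clone survives the await while the original's items are emptied. Here's a minimal sketch, assuming a dropZone element like the ones in the other answers and a hypothetical processFiles standing in for whatever asynchronous work you planned:

dropZone.addEventListener("drop", async (evt) => {
  evt.preventDefault();
  // Clone synchronously, before the first await invalidates the original
  const clone = cloneDataTransfer(evt.dataTransfer);

  await processFiles(clone.files); // hypothetical async work
  console.log(clone.files.length); // the cloned files are still available here
});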


Once you call await, you're no longer in the original call stack of the function. This matters particularly in an event listener: the drag data is only readable while the drop event is being dispatched, so by the time the awaited code resumes, the items are gone.

We can reproduce the same effect with setTimeout:

dropZone.addEventListener('drop', async (e) => {
  e.preventDefault();
  console.log(e.dataTransfer.items);
  setTimeout(()=> {
    console.log(e.dataTransfer.items);
  })
});

For example, dragging four files will output:

DataTransferItemList {0: DataTransferItem, 1: DataTransferItem, 2: DataTransferItem, 3: DataTransferItem, length: 4}  
DataTransferItemList {length: 0}

After the event has finished dispatching, the state has changed and the items have been lost.

There are two ways to handle this issue:

  • Copy the items and iterate over the copies
  • Push the async jobs (promises) into an array and handle them later with Promise.all

The second solution is more intuitive than awaiting inside the loop. Also, keep in mind that browsers limit the number of parallel connections; with an array of jobs you can process them in chunks to cap the number of simultaneous uploads, as sketched below.
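
A minimal chunking sketch, assuming files is a plain array (for example, the copied files from the first option) and a hypothetical uploadFile(file) that returns a promise:

async function uploadInChunks(files, chunkSize = 4) {
  const results = [];
  for (let i = 0; i < files.length; i += chunkSize) {
    const chunk = files.slice(i, i + chunkSize);
    // Wait for this chunk to settle before starting the next,
    // so at most chunkSize uploads run at once
    results.push(...(await Promise.all(chunk.map(uploadFile))));
  }
  return results;
}

The runnable demo below uses a pointlessDelay in place of a real upload: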

function pointlessDelay() {
  // Stands in for a real upload request
  return new Promise((resolve) => {
    setTimeout(resolve, 1000);
  });
}

const dropZone = document.querySelector('.dropZone');

dropZone.addEventListener('dragover', (e) => {
  e.preventDefault();
});

dropZone.addEventListener('drop', async (e) => {
  e.preventDefault();
  console.log(e.dataTransfer.items);
  const queue = [];
  
  for (const item of e.dataTransfer.items) {
    console.log('next loop');
    const entry = item.webkitGetAsEntry();
    console.log({item, entry});
    queue.push(pointlessDelay().then(() => console.log(`${entry.name} uploaded`)));
  }
  
  await Promise.all(queue);
});
body {
  font-family: sans-serif;
}

.dropZone {
  display: inline-flex;
  background: #3498db;
  color: #ecf0f1;
  border: 0.3em dashed #ecf0f1;
  border-radius: 0.3em;
  padding: 5em;
  font-size: 1.2em;
}
<div class="dropZone">
  Drop Zone
</div>