Pip build option to use multicore

From what I can tell, pip does not have this ability, but I may be mistaken.

To do multiprocessing in Python you use the multiprocessing package; [here is a guide I found](http://pymotw.com/2/multiprocessing/basics.html) on how to do it, if you are interested, and the Python docs cover it as well. I also found the question Multiprocessing vs Threading Python useful for making sure that multiprocessing does what I thought it did, namely take advantage of multiple CPUs.
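
To give a flavor of what the package does (a minimal sketch of the API, not anything pip itself uses; the file names are hypothetical stand-ins for CPU-bound work):

    import multiprocessing

    def build_task(name):
        # Stand-in for CPU-bound work, e.g. compiling one source file.
        print("working on", name)

    if __name__ == "__main__":
        # A Pool spreads the tasks across multiple CPU cores.
        with multiprocessing.Pool(processes=4) as pool:
            pool.map(build_task, ["a.c", "b.c", "c.c", "d.c"])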

I have gone through the pip source code (available here) looking for a reference to the multiprocessing package and did not find any use of it. This would mean that pip does not use/support multiprocessing. From what I can tell, the /pip/commands/install.py file is the one of interest for your question, as it is called when you run pip install <package>. For this file specifically, the imports are

from __future__ import absolute_import

import logging
import os
import tempfile
import shutil
import warnings

from pip.req import InstallRequirement, RequirementSet, parse_requirements
from pip.locations import virtualenv_no_global, distutils_scheme
from pip.basecommand import Command
from pip.index import PackageFinder
from pip.exceptions import (
    InstallationError, CommandError, PreviousBuildDirError,
)
from pip import cmdoptions
from pip.utils.deprecation import RemovedInPip7Warning, RemovedInPip8Warning

which, as you can see, has no reference to the multiprocessing package; I did check all of the other files as well, just to be sure.
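
If you want to repeat that check yourself, a quick way is to grep the source tree (assuming you have cloned the pip repository; the layout of the time kept the package in a pip/ subdirectory):

    git clone https://github.com/pypa/pip.git
    grep -rn "multiprocessing" pip/pip/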

Furthermore, I checked the pip install documentation and found no reference to installing using multiple cores.

TL;DR: Pip doesn't do what you are asking. I may be wrong, as I didn't look at the source for that long, but I'm pretty sure it just doesn't support it.


Tested, this works: https://stackoverflow.com/a/57014278/6147756

Single command:

MAKEFLAGS="-j$(nproc)" pip install xxx

Enable for all commands in a script:

export MAKEFLAGS="-j$(nproc)"
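
For example, in a build script (the package names are placeholders; this only helps for packages that invoke make under the hood when compiling C/C++ sources):

    #!/bin/sh
    # Make every build below use as many jobs as there are CPU cores.
    export MAKEFLAGS="-j$(nproc)"
    pip install xxx
    pip install yyy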

The Ultimate Way to Resolve This Problem

Because the C/C++ files are compiled by the make command, and make has an option that specifies how many CPU cores should be used to compile the source code, we can play a trick on make.

  1. Back up your original make command:

    sudo cp /usr/bin/make /usr/bin/make.bak

  2. Write a "fake" make command, which appends --jobs=6 to its parameter list and passes everything on to the original make command, make.bak:

    #!/bin/sh
    make.bak --jobs=6 "$@"

  3. Make the new file executable:

    sudo chmod +x /usr/bin/make
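
If you prefer to do this in one shot, something like the following should work (a sketch assuming a POSIX shell and that step 1's make.bak is already in place):

    sudo tee /usr/bin/make >/dev/null <<'EOF'
    #!/bin/sh
    # Forward everything to the real make, adding --jobs=6.
    exec /usr/bin/make.bak --jobs=6 "$@"
    EOF
    sudo chmod +x /usr/bin/make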

After that, not only Python packages with C libraries, but anything else that builds C libraries will compile on 6 cores. In fact, everything compiled via the make command will speed up.

And good luck.


Use: --install-option="--jobs=6" (pip docs).

pip3 install --install-option="--jobs=6" PyXXX

I had the same need: using pip install to speed up the compile process. My target package was PySide. At first I used pip3 install pyside, and it took me nearly 30 minutes (AMD 1055T 6 cores, 10 GB RAM), with only one core at 100% load.

There are no clues in pip3 --help, but I had seen lots of options like pip install -u pyXXX, though I didn't know what '-u' was, and that parameter was not in pip --help either. I tried pip3 install --help and there came the answer: --install-option.

I read PySide's code and found another clue: OPTION_JOBS = has_option('jobs'). I put ipdb.set_trace() there and finally understood how to use multiple cores to compile via pip install.
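
For context, the pattern is roughly as follows (a reconstruction for illustration, not PySide's exact code; the idea is that has_option consumes a flag from the arguments pip forwards to setup.py):

    import sys

    def has_option(name):
        # Sketch: return True and consume the flag if --<name>
        # (optionally with a value) appears on the command line
        # that pip forwards to setup.py.
        for arg in list(sys.argv):
            if arg == "--" + name or arg.startswith("--" + name + "="):
                sys.argv.remove(arg)
                return True
        return False

    OPTION_JOBS = has_option("jobs")  # set by --install-option="--jobs=6"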

With --jobs=6, it took me about 6 minutes.

--------------------------update------------------------------

As per the comments below, I finally used a trick like this:

    cd /usr/bin
    sudo mv make make.bak
    sudo touch make

Then edit make (with vim, or any other way you like) and type this:

    #!/bin/sh
    make.bak --jobs=6 "$@"

I'm not familiar with bash, so I'm not sure this is the correct bash code; I'm writing this comment on Windows. The key is to rename make into make.bak, and then create a new make that calls make.bak with the added parameter --jobs=6 (remember to make the new make executable: sudo chmod +x make).
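
Note that this trick affects every make invocation on the system, so to undo it later, just restore the original:

    sudo mv /usr/bin/make.bak /usr/bin/make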