If you want them to run truly in parallel (in CPython, threads are limited by the GIL), you need to create a separate process for each script.
In the case of two independent scripts you can create a third script and use subprocess.Popen
to spawn a process for each one:
#!/usr/bin/env python
import subprocess

# Iterable with the paths of the scripts.
scripts_paths = ("C:/Users/test_1/script1.py", "C:/Users/test_2/script2.py")

# Create one process per script.
procesos = [subprocess.Popen(["python", script]) for script in scripts_paths]

# Wait for all the subprocesses to finish.
for proceso in procesos:
    proceso.wait()

# Rest of the code to run once all the subprocesses have finished.
If you need or want to check whether the processes finished correctly, you can use a list comprehension that collects the exit codes returned by wait:

exit_codes = [p.wait() for p in procesos]
Usually 0 is returned if the program ended successfully, so you can condition the subsequent behavior based on it:
#!/usr/bin/env python
import subprocess

scripts_paths = ("C:/Users/test_1/script1.py", "C:/Users/test_2/script2.py")

ps = [subprocess.Popen(["python", script]) for script in scripts_paths]
exit_codes = [p.wait() for p in ps]

if not any(exit_codes):
    print("All processes finished successfully.")
else:
    print("Some processes ended unexpectedly.")
If the scripts require arguments, they can be passed as well by appending them to the argument list given to Popen.
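As a minimal sketch of how arguments are passed, each command is just a list: interpreter, script path, then the arguments. The snippets below use "python -c" one-liners instead of real script files so the example is self-contained; the argument values are illustrative only.

```python
import subprocess
import sys

# Each command is a list: interpreter, script (here an inline -c snippet),
# followed by its arguments, which the child reads from sys.argv.
commands = [
    [sys.executable, "-c", "import sys; print(sys.argv[1])", "hello"],
    [sys.executable, "-c", "import sys; print(sys.argv[1])", "world"],
]

# Launch both processes and wait for their exit codes, as above.
procesos = [subprocess.Popen(cmd) for cmd in commands]
exit_codes = [p.wait() for p in procesos]
print(exit_codes)  # [0, 0] if both children succeeded
```

With real scripts you would simply replace the `-c` snippet with the script path, e.g. `["python", "C:/Users/test_1/script1.py", "--some-arg", "value"]`.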
If you need the two processes to interact with each other safely, you should use multiprocessing and establish safe mechanisms for communication between them.