ssh: rewrite proxy management for multiprocessing usage
We changed sync to use multiprocessing for parallel work. This broke
the ssh proxy code, as it's all based on threads. Rewrite the logic to
be multiprocessing-safe.
Now, instead of the module acting as a stateful object, callers have to
instantiate a new ProxyManager class that holds all the state and pass
it down to any users.
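
The sketch below illustrates the general shape of that change: SSH state
lives on an object that is handed to multiprocessing workers explicitly.
ProxyState here is a hypothetical stand-in for illustration only; it is
not the real ssh.ProxyManager, whose constructor and internals are not
part of this hunk.

import multiprocessing

class ProxyState:
  """Keeps SSH master-session state on an instance instead of in module
  globals, so it can be passed to worker processes explicitly."""

  def __init__(self, manager):
    # A manager-backed dict is picklable and visible to every worker.
    self.masters = manager.dict()

  def preconnect(self, url):
    # Record that a master session would be opened for this URL.
    self.masters.setdefault(url, True)
    return True

def _fetch_one(args):
  url, ssh_proxy = args
  # Workers receive the proxy object explicitly; module-level state would
  # be duplicated per process rather than shared across them.
  return ssh_proxy.preconnect(url)

if __name__ == '__main__':
  manager = multiprocessing.Manager()
  ssh_proxy = ProxyState(manager)
  urls = ['ssh://example.com/a', 'ssh://example.com/b']
  with multiprocessing.Pool(2) as pool:
    results = pool.map(_fetch_one, [(url, ssh_proxy) for url in urls])
  print(results, ssh_proxy.masters.copy())

Because the manager proxy is picklable, the same state object can be
shared with workers whether they are forked or spawned.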
Bug: https://crbug.com/gerrit/12389
Change-Id: I4b1af116f7306b91e825d3c56fb4274c9b033562
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/305486
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Chris Mcdonald <cjmcdonald@google.com>
diff --git a/git_config.py b/git_config.py
index d7fef8c..978f6a5 100644
--- a/git_config.py
+++ b/git_config.py
@@ -27,7 +27,6 @@
from error import GitError, UploadError
import platform_utils
from repo_trace import Trace
-import ssh
from git_command import GitCommand
from git_refs import R_CHANGES, R_HEADS, R_TAGS
@@ -519,17 +518,23 @@
return self.url.replace(longest, longestUrl, 1)
- def PreConnectFetch(self):
+ def PreConnectFetch(self, ssh_proxy):
"""Run any setup for this remote before we connect to it.
In practice, if the remote is using SSH, we'll attempt to create a new
SSH master session to it for reuse across projects.
+ Args:
+ ssh_proxy: The SSH settings for managing master sessions.
+
Returns:
Whether the preconnect phase for this remote was successful.
"""
+ if not ssh_proxy:
+ return True
+
connectionUrl = self._InsteadOf()
- return ssh.preconnect(connectionUrl)
+ return ssh_proxy.preconnect(connectionUrl)
def ReviewUrl(self, userEmail, validate_certs):
if self._review_url is None:
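
For callers, the contract added in this hunk is: with no proxy supplied,
the preconnect phase is skipped and still reported as successful; with a
proxy, the decision is delegated to it. A minimal, self-contained sketch
of that behavior follows; FakeRemote and FakeProxy are stand-ins that
mirror only the lines above, not the real git_config.Remote or
ssh.ProxyManager.

class FakeProxy:
  def preconnect(self, url):
    # Stand-in for the real proxy's preconnect(); just reports success.
    print('would open an ssh master session for', url)
    return True

class FakeRemote:
  """Mirrors only the PreConnectFetch() logic added in the hunk above."""

  def _InsteadOf(self):
    return 'ssh://example.com/project'

  def PreConnectFetch(self, ssh_proxy):
    if not ssh_proxy:
      return True  # no proxy supplied: nothing to set up, still successful
    return ssh_proxy.preconnect(self._InsteadOf())

remote = FakeRemote()
assert remote.PreConnectFetch(None)         # preconnect skipped gracefully
assert remote.PreConnectFetch(FakeProxy())  # delegated to the proxy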