Synchronous jobs limited to one at a time?

Hello, I wanted to clarify some behavior I am seeing and determine whether it is intended, a bug, or possibly user error.

I understand based on the synchronous vs. asynchronous calls documentation that placing a synchronous C-MOVE request will keep the API request open until the C-MOVE is complete. However, I do not see any documented limitation on running multiple C-MOVEs in parallel using synchronous mode.

To test this, I ran the following command to start nine C-MOVEs in parallel (the loop `{0..8}` runs nine iterations):

for i in {0..8}; do
  curl http://localhost:8042/queries/253e37db-2ec9-41e1-9bc9-eff88fe9481c/answers/$i/retrieve \
    --user orthanc:orthanc \
    -X POST \
    -d '{ "Asynchronous": false, "Priority": '$i', "Simplify": true }' &
done

Monitoring the /jobs?expand endpoint, I see only one job in the Running state at a time, and I never see any job in the Pending state. (Below is what I see from a small monitoring script I put together; the columns are creation time, completion time (null while the job runs), short job ID, job type, priority, and state.)
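For context, a minimal sketch of such a polling loop could look like the following. This is not my exact script, and it assumes the CreationTime, CompletionTime, ID, Type, Priority, and State fields that /jobs?expand returns:

```python
import base64
import json
import time
import urllib.request

BASE = "http://localhost:8042"
AUTH = "Basic " + base64.b64encode(b"orthanc:orthanc").decode()

def fetch_jobs():
    # GET /jobs?expand returns the full description of every job
    req = urllib.request.Request(BASE + "/jobs?expand")
    req.add_header("Authorization", AUTH)
    with urllib.request.urlopen(req) as response:
        return json.loads(response.read())

def format_job(job):
    # One line per job: creation time, completion time (or "null"),
    # short ID, job type, priority, state
    return " ".join([
        job["CreationTime"],
        job.get("CompletionTime") or "null",
        job["ID"][:5],
        job["Type"],
        str(job["Priority"]),
        job["State"],
    ])

def monitor(interval=1.0):
    # Print a snapshot of all jobs once per interval
    while True:
        for job in fetch_jobs():
            print(format_job(job))
        print("---------------")
        time.sleep(interval)
```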

20260105T194023.428086 20260105T194024.344448 df47d DicomMoveScu 5  Success
20260105T194024.344913 20260105T194025.704196 e9092 DicomMoveScu 4  Success
20260105T194025.704937 null 87f82 DicomMoveScu 3  Running
---------------
20260105T194024.344913 20260105T194025.704196 e9092 DicomMoveScu 4  Success
20260105T194025.704937 20260105T194028.495471 87f82 DicomMoveScu 3  Success
20260105T194028.495811 null d4d0e DicomMoveScu 6  Running
---------------
20260105T194025.704937 20260105T194028.495471 87f82 DicomMoveScu 3  Success
20260105T194028.495811 20260105T194034.009766 d4d0e DicomMoveScu 6  Success
20260105T194034.010158 null 9d7d1 DicomMoveScu 7  Running
---------------
20260105T194025.704937 20260105T194028.495471 87f82 DicomMoveScu 3  Success
20260105T194028.495811 20260105T194034.009766 d4d0e DicomMoveScu 6  Success
20260105T194034.010158 20260105T194036.658841 9d7d1 DicomMoveScu 7  Success

When I run the same command in asynchronous mode, I immediately see all the jobs appear, with N of them Running and the rest Pending (where N is my concurrent-jobs limit):

20260105T194112.761449 null 4cf7b DicomMoveScu 0  Running
20260105T194112.775612 null 5c8c1 DicomMoveScu 2  Running
20260105T194112.775968 null 5c5fd DicomMoveScu 1  Running
20260105T194112.777045 null f62e0 DicomMoveScu 8  Running
20260105T194112.782005 null b3833 DicomMoveScu 4  Pending
20260105T194112.782476 null 0ae0f DicomMoveScu 5  Pending
20260105T194112.789433 null 38b5e DicomMoveScu 7  Pending
20260105T194112.790756 null c6636 DicomMoveScu 6  Pending
20260105T194112.791152 null 072ee DicomMoveScu 3  Pending
---------------
20260105T194112.761449 null 4cf7b DicomMoveScu 0  Running
20260105T194112.775612 null 5c8c1 DicomMoveScu 2  Running
20260105T194112.775968 null 5c5fd DicomMoveScu 1  Running
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 null b3833 DicomMoveScu 4  Pending
20260105T194112.782476 null 0ae0f DicomMoveScu 5  Pending
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Pending
20260105T194112.791152 null 072ee DicomMoveScu 3  Pending
---------------
20260105T194112.761449 null 4cf7b DicomMoveScu 0  Running
20260105T194112.775612 null 5c8c1 DicomMoveScu 2  Running
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 null b3833 DicomMoveScu 4  Pending
20260105T194112.782476 null 0ae0f DicomMoveScu 5  Pending
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Pending
---------------
20260105T194112.761449 null 4cf7b DicomMoveScu 0  Running
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 null b3833 DicomMoveScu 4  Pending
20260105T194112.782476 null 0ae0f DicomMoveScu 5  Running
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Pending
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 null b3833 DicomMoveScu 4  Running
20260105T194112.782476 null 0ae0f DicomMoveScu 5  Running
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Pending
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 null b3833 DicomMoveScu 4  Running
20260105T194112.782476 20260105T194117.188067 0ae0f DicomMoveScu 5  Success
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Running
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 20260105T194118.310859 b3833 DicomMoveScu 4  Success
20260105T194112.782476 20260105T194117.188067 0ae0f DicomMoveScu 5  Success
20260105T194112.789433 null 38b5e DicomMoveScu 7  Running
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Running
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 20260105T194118.310859 b3833 DicomMoveScu 4  Success
20260105T194112.782476 20260105T194117.188067 0ae0f DicomMoveScu 5  Success
20260105T194112.789433 20260105T194119.088919 38b5e DicomMoveScu 7  Success
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 null 072ee DicomMoveScu 3  Running
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 20260105T194118.310859 b3833 DicomMoveScu 4  Success
20260105T194112.782476 20260105T194117.188067 0ae0f DicomMoveScu 5  Success
20260105T194112.789433 20260105T194119.088919 38b5e DicomMoveScu 7  Success
20260105T194112.790756 null c6636 DicomMoveScu 6  Running
20260105T194112.791152 20260105T194121.424691 072ee DicomMoveScu 3  Success
---------------
20260105T194112.761449 20260105T194115.978053 4cf7b DicomMoveScu 0  Success
20260105T194112.775612 20260105T194115.758546 5c8c1 DicomMoveScu 2  Success
20260105T194112.775968 20260105T194113.936231 5c5fd DicomMoveScu 1  Success
20260105T194112.777045 20260105T194113.607130 f62e0 DicomMoveScu 8  Success
20260105T194112.782005 20260105T194118.310859 b3833 DicomMoveScu 4  Success
20260105T194112.782476 20260105T194117.188067 0ae0f DicomMoveScu 5  Success
20260105T194112.789433 20260105T194119.088919 38b5e DicomMoveScu 7  Success
20260105T194112.790756 20260105T194121.682706 c6636 DicomMoveScu 6  Success
20260105T194112.791152 20260105T194121.424691 072ee DicomMoveScu 3  Success

I also monitored the metrics: in synchronous mode, all of the API requests are active, so I believe the requests are getting through and each one is trying to create a job.

I’m not very familiar with C++, but I suspect this happens because JobsRegistry::SubmitAndWait() locks the jobs engine mutex while waiting for the job to complete:

  void JobsRegistry::SubmitAndWait(Json::Value& successContent,
                                   IJob* job,        // Takes ownership
                                   int priority)
  {
    std::string id;
    Submit(id, job, priority);

    JobState state = JobState_Pending;  // Dummy initialization

    {
      boost::mutex::scoped_lock lock(mutex_); // Lock occurs here

      for (;;)
      {
        if (!GetStateInternal(state, id))
...

Meanwhile, any newly submitted job must wait for that lock to be released, because JobsRegistry::SubmitInternal() acquires it before creating the new job:

  void JobsRegistry::SubmitInternal(std::string& id,
                                    JobHandler* handler)
  {
    if (handler == NULL)
    {
      throw OrthancException(ErrorCode_NullPointer);
    }

    std::unique_ptr<JobHandler>  protection(handler);

    {
      boost::mutex::scoped_lock lock(mutex_); // Lock required here
      CheckInvariants();

      id = handler->GetId();
      int priority = handler->GetPriority();

      jobsIndex_.insert(std::make_pair(id, protection.release()));

      switch (handler->GetState())
...
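If that reading is right, the observable effect would be the serialization pattern below. This is a toy Python model of my hypothesis, not Orthanc code: each "synchronous" submitter holds the registry lock for the whole lifetime of its job, so the jobs run one at a time even with many submitting threads:

```python
import threading
import time

registry_lock = threading.Lock()  # stand-in for JobsRegistry::mutex_
events = []  # (phase, job_id) tuples, in the order they happen

def submit_and_wait(job_id):
    # Models the suspected behavior: the lock is acquired and then held
    # while "waiting" for the job to finish, so no other submission can
    # even be registered in the meantime.
    with registry_lock:
        events.append(("submitted", job_id))
        time.sleep(0.01)  # the job "runs" while the lock is held
        events.append(("completed", job_id))

threads = [threading.Thread(target=submit_and_wait, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Each job's "submitted" is immediately followed by its own "completed":
# the jobs are fully serialized even though four threads ran in parallel.
print(events)
```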

My question is whether this is intended/expected, or whether there is any way to run concurrent jobs in synchronous mode. If not, I think this limitation would be a good addition to the docs: they do say that asynchronous mode is preferred, but the reason given is primarily network timeouts.
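In the meantime, the workaround is to submit with "Asynchronous": true and poll the job that the call returns. A rough sketch, assuming the asynchronous response carries the job's "ID" field (which is what I observe from Orthanc's asynchronous calls):

```python
import base64
import json
import time
import urllib.request

BASE = "http://localhost:8042"
AUTH = "Basic " + base64.b64encode(b"orthanc:orthanc").decode()

def post_json(path, body):
    req = urllib.request.Request(BASE + path,
                                 data=json.dumps(body).encode(),
                                 method="POST")
    req.add_header("Authorization", AUTH)
    with urllib.request.urlopen(req) as response:
        return json.loads(response.read())

def get_json(path):
    req = urllib.request.Request(BASE + path)
    req.add_header("Authorization", AUTH)
    with urllib.request.urlopen(req) as response:
        return json.loads(response.read())

def is_finished(job):
    # Terminal states of Orthanc's jobs engine
    return job["State"] in ("Success", "Failure")

def retrieve_and_wait(query_id, answer_index, priority=0, poll_interval=0.5):
    # Submit the C-MOVE asynchronously; the response contains the job ID
    submitted = post_json(
        f"/queries/{query_id}/answers/{answer_index}/retrieve",
        {"Asynchronous": True, "Priority": priority},
    )
    # Poll the job until it reaches a terminal state
    while True:
        job = get_json("/jobs/" + submitted["ID"])
        if is_finished(job):
            return job
        time.sleep(poll_interval)
```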

Thanks for your time and as always for your work on Orthanc.

Actually, I have an update.

When running the same test against the /modalities/{id}/move endpoint, I do not see this behavior, even in synchronous mode. I haven’t dug into this deeply, but it seems to be limited to the /queries/{id}/answers/{index}/retrieve endpoint.
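An equivalent retrieve via /modalities/{id}/move can be sketched like this. The modality name and study UID are placeholders, and the body fields (Level, Resources, Asynchronous, Priority) are the ones the Orthanc REST API documents for this route:

```python
import base64
import json
import urllib.request

BASE = "http://localhost:8042"
AUTH = "Basic " + base64.b64encode(b"orthanc:orthanc").decode()

def build_move_body(study_uid, synchronous=True, priority=0):
    # Ask the remote modality to send one study back to us
    return {
        "Level": "Study",
        "Resources": [{"StudyInstanceUID": study_uid}],
        "Asynchronous": not synchronous,
        "Priority": priority,
    }

def move(modality, study_uid, synchronous=True):
    # POST /modalities/{id}/move triggers the C-MOVE
    body = json.dumps(build_move_body(study_uid, synchronous)).encode()
    req = urllib.request.Request(f"{BASE}/modalities/{modality}/move",
                                 data=body, method="POST")
    req.add_header("Authorization", AUTH)
    with urllib.request.urlopen(req) as response:
        return json.loads(response.read())
```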

Hi @josh.keller

All threads were actually blocked here:

The next 7 threads were blocked here:

So the culprit was actually the QueryAccessor object, which was locking all the HTTP threads even before they could create their job.

This is now fixed in this commit (you know how it goes: a two-hour investigation to move a single line in the end :wink: )

Best,

Alain.

Yes, I do know about that. :sweat_smile: Thank you for finding this one!