OperationQueue

It’s built on top of Grand Central Dispatch (GCD) and provides a higher-level abstraction for managing concurrent operations.

Operation itself is an abstract class and is never used directly. We can use the system-provided BlockOperation subclass or create our own Operation subclass, and start an operation either by adding it to an OperationQueue or by calling its start() method manually.

The queue automatically manages the execution of operations, executing them in FIFO (first-in, first-out) order by default. However, we can change the priority of operations or cancel them as needed.

Operation Queues provide additional features such as built-in support for dependencies and cancelation, making them more suitable for managing complex workflows and operations.

Scenario 1:

Imagine you have two tasks: Task A downloads an image from a URL, and Task B processes the downloaded image. Task B should only execute after Task A has completed successfully.
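A minimal sketch of this setup using BlockOperation (the URL and the image-processing step are placeholders):

```swift
import Foundation
import UIKit

let queue = OperationQueue()
var downloadedImage: UIImage?

// Task A: download the image (synchronous Data(contentsOf:) used only to keep the sketch short)
let downloadOperation = BlockOperation {
    if let url = URL(string: "https://example.com/image.png"),
       let data = try? Data(contentsOf: url) {
        downloadedImage = UIImage(data: data)
    }
}

// Task B: process the downloaded image
let processOperation = BlockOperation {
    guard let image = downloadedImage else { return }
    print("Processing image of size \(image.size)")
}

// Task B will not start until Task A has finished
processOperation.addDependency(downloadOperation)

queue.addOperations([downloadOperation, processOperation], waitUntilFinished: false)
```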

Scenario 2:

Suppose the user decides to cancel the image downloading process while it’s in progress.
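One way to support this, sketched with a custom Operation subclass that checks isCancelled as it works (the chunked loop only simulates a download):

```swift
import Foundation

final class ImageDownloadOperation: Operation {
    override func main() {
        // Bail out immediately if cancellation was requested before we started
        guard !isCancelled else { return }

        for chunk in 1...10 {
            // Periodically check isCancelled and abort if the user cancelled
            if isCancelled { return }
            Thread.sleep(forTimeInterval: 0.2)   // simulate downloading one chunk
            print("Downloaded chunk \(chunk)")
        }
    }
}

let queue = OperationQueue()
let downloadOperation = ImageDownloadOperation()
queue.addOperation(downloadOperation)

// Later, when the user cancels:
downloadOperation.cancel()   // only sets isCancelled; main() must honour it
```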

Explanation:

  • Dependency Creation: By using the addDependency(_:) method, you establish a dependency relationship between processOperation and downloadOperation. This ensures that processOperation will not start until downloadOperation finishes.
  • Cancellation: By calling the cancel() method on an operation, we can request that it stop executing. However, it’s important to note that this only sets the operation’s isCancelled property to true. It’s up to the operation to check this property periodically during its execution and abort if cancellation is requested.

Operation Queue offers powerful features for managing dependencies between operations and handling cancellation requests gracefully. These features are particularly useful in scenarios where tasks have complex interdependencies or when the user needs to interact with long-running operations.

GCD vs OperationQueue

GCD:

  • Doesn’t have built-in support for managing dependencies between tasks. We need to handle dependencies manually, for example with dispatch groups or by nesting blocks.
  • Doesn’t have built-in support for cancellation. We need to explicitly check for cancellation within our blocks and return early if needed.

Operation Queue:

  • Offers built-in support for managing dependencies between operations using addDependency(_:) method. This makes it easier to define and manage complex task dependencies.
  • Provides built-in support for cancellation by setting the isCancelled property of operations. Operations can check this property periodically and abort their execution if cancellation is requested.

Global Dispatch Queues

GCD provides a set of global dispatch queues that are managed by the system. These queues are categorised into different quality-of-service (QoS) classes, indicating their priority.

  • userInteractive: Highest priority, used for tasks that must be executed immediately to provide a smooth user experience.
  • userInitiated: High priority, used for tasks initiated by the user that require immediate results.
  • default: Default priority, used for tasks that are not time-critical but should be executed reasonably soon.
  • utility: Low priority, used for long-running tasks that are not time-sensitive, such as file I/O or network requests.
  • background: Lowest priority, used for tasks that can run in the background, such as data synchronisation or prefetching.
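For example, work can be dispatched to a global queue with a specific QoS class (a small sketch):

```swift
import Foundation

// Run a long, non-time-sensitive task on the utility queue
DispatchQueue.global(qos: .utility).async {
    let result = (1...1_000_000).reduce(0, +)   // simulate heavy work
    print("Finished background work: \(result)")
}
```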

Main Dispatch Queue:

The Main Dispatch Queue is a special serial dispatch queue associated with the main thread of your application. It’s the primary queue for updating the user interface and handling events like user interactions.
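The walkthrough below assumes code along these lines, with nested async calls on the main queue (the exact print statements are inferred from the steps that follow):

```swift
import Foundation

// Assume this runs on the main thread (e.g. inside viewDidLoad)
print("1")
DispatchQueue.main.async {
    print("2")
    DispatchQueue.main.async {
        print("3")
    }
    print("4")
}
print("5")
// Output: 1, 5, 2, 4, 3
```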

  1. print("1") is executed synchronously, printing “1” immediately.
  2. DispatchQueue.main.async is called to asynchronously execute the closure on the main queue.
  3. print("5") is executed synchronously after the async call, printing “5”.
  4. Inside the async closure:
    • print("2") is executed synchronously, printing “2”.
    • DispatchQueue.main.async is called again to asynchronously execute the inner closure on the main queue.
    • print("4") is executed synchronously after the async call, printing “4”.
  5. Inside the inner async closure:
    • print("3") is executed synchronously, printing “3”.
  1. print("1") is executed synchronously, printing “1” immediately.
  2. DispatchQueue.main.async is called to asynchronously execute the closure on the main queue.
  3. print("5") is executed synchronously after the async call, printing “5”.
  4. Inside the async closure:
    • print("2") is executed synchronously, printing “2”.
    • DispatchQueue.global().async is called to asynchronously execute the closure on the global queue.
    • print("4") is executed synchronously after the async call, printing “4”.
  5. print("3") is not guaranteed to be executed before “4” because it’s scheduled asynchronously on the global queue, which may take some time to execute.

Explanation:

  1. The outer block (which prints “2”) executes first because it is dispatched before the inner block (which prints “1”).
  2. The inner block (which prints “1”) executes after the outer block because it is nested within it and is only enqueued once the outer block runs.
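A minimal sketch consistent with this behaviour, assuming both blocks are dispatched asynchronously to the same serial queue (the queue label and print values are illustrative):

```swift
import Foundation

let queue = DispatchQueue(label: "com.example.serial")   // serial queue (assumed)

queue.async {            // outer block
    print("2")
    queue.async {        // inner block: enqueued only once the outer block runs
        print("1")
    }
}
// Output: 2, then 1
```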

DispatchQueue.main is a serial queue backed by a single thread: the main thread. If we call sync on this queue from code that is already running on the main thread, the main thread blocks while waiting for the sync block to finish, but that block can only ever run on the main thread, which is now blocked. The result is a deadlock, and the app crashes.
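A minimal sketch of the deadlock (assumes the code is already running on the main thread, e.g. in viewDidLoad):

```swift
import Foundation

// Already on the main thread
DispatchQueue.main.sync {       // the main thread waits here for the block...
    print("This never runs")    // ...but the block can only run on the blocked main thread
}
// Deadlock: execution never reaches this line
```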

Dispatch Groups

Dispatch groups provide a way to monitor the completion of multiple tasks dispatched asynchronously. They allow you to wait until all tasks in the group have finished executing before continuing with further code.
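A small sketch using enter/leave with a notify callback (the URLs are placeholders):

```swift
import Foundation

let group = DispatchGroup()
let urls = ["https://example.com/a.json", "https://example.com/b.json"]

for urlString in urls {
    guard let url = URL(string: urlString) else { continue }
    group.enter()                                  // mark a task as started
    URLSession.shared.dataTask(with: url) { data, _, _ in
        print("Finished \(urlString), bytes: \(data?.count ?? 0)")
        group.leave()                              // mark the task as finished
    }.resume()
}

// Runs only after every enter() has been balanced by a leave()
group.notify(queue: .main) {
    print("All downloads finished")
}
```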

Grand Central Dispatch (GCD)

Grand Central Dispatch is a low-level API provided by Apple for managing concurrent operations. GCD abstracts away many of the complexities of thread management and provides a simple and efficient way to execute tasks concurrently.

It provides a set of APIs for managing tasks and executing them concurrently on multicore hardware. GCD helps developers to create responsive and scalable applications by offloading time-consuming tasks to background threads, thus keeping the main thread free to handle user interactions.

There are two types of dispatch queues:

  • Serial Queues: Executes tasks one at a time in the order they are added to the queue. Tasks in a serial queue are guaranteed to run sequentially.
  • Concurrent Queues: Can execute multiple tasks concurrently. Tasks may start and finish in any order.

Serial Queues

A serial queue is a type of dispatch queue in Grand Central Dispatch (GCD) that executes tasks in a FIFO (first-in, first-out) order. This means that tasks added to the queue are executed one at a time, in the order in which they were added.

Serial queues are useful when you want to ensure that tasks are executed sequentially, avoiding concurrency issues such as race conditions.
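For example (a sketch with an arbitrary queue label):

```swift
import Foundation

let serialQueue = DispatchQueue(label: "com.example.serialQueue")

serialQueue.async { print("Task 1") }
serialQueue.async { print("Task 2") }
serialQueue.async { print("Task 3") }
// Always prints Task 1, Task 2, Task 3 — one at a time, in order
```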

Concurrent Queues

A concurrent queue in Swift allows multiple tasks to be executed concurrently, meaning they can run simultaneously. Unlike serial queues, tasks in a concurrent queue can start and finish in any order.
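A sketch of creating one (the label is arbitrary):

```swift
import Foundation

let concurrentQueue = DispatchQueue(label: "com.example.concurrentQueue",
                                    attributes: .concurrent)

concurrentQueue.async { print("Task 1") }
concurrentQueue.async { print("Task 2") }
concurrentQueue.async { print("Task 3") }
// The tasks may run simultaneously and finish in any order
```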

Concurrency & Multithreading

Thread

A thread is the smallest unit of execution within a process. It represents a single sequence of instructions that can be scheduled and executed independently by the operating system’s scheduler.

Multithreading

The term “multithreading” refers to the use of multiple threads within a single process. Multithreading allows different parts of a program to execute concurrently and share resources such as memory and I/O devices.

Parallelism

Threads enable parallelism by executing multiple tasks simultaneously on multicore processors. By utilizing multiple threads, applications can take advantage of the available CPU cores to improve performance and responsiveness.

Thread Safety

When multiple threads access shared resources concurrently, it’s essential to ensure thread safety to avoid race conditions and data corruption. Synchronization mechanisms such as locks, semaphores, and atomic operations are used to coordinate access to shared resources and prevent conflicts between threads.
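One common pattern is to protect the shared resource with a lock (a sketch using NSLock; a serial queue or a dispatch barrier would work as well):

```swift
import Foundation

final class Counter {
    private let lock = NSLock()
    private var value = 0

    func increment() {
        lock.lock()              // only one thread may mutate `value` at a time
        value += 1
        lock.unlock()
    }

    var current: Int {
        lock.lock()
        defer { lock.unlock() }
        return value
    }
}
```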

Sync

In synchronous programming, tasks are executed sequentially, one after the other. Each task must wait for the previous one to complete before it can start. Synchronous operations block the execution of the program until they are finished, meaning that the program waits for the operation to complete before moving on to the next task.

Async

In asynchronous programming, tasks can be executed concurrently or in parallel, allowing the program to continue executing other tasks while waiting for certain operations to complete. Asynchronous operations do not block the execution of the program. Instead, they execute in the background, and the program can continue performing other tasks while waiting for the asynchronous operation to finish.
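The difference in a small sketch (using a background global queue):

```swift
import Foundation

let queue = DispatchQueue.global(qos: .userInitiated)

// sync: the caller waits until the block has finished
queue.sync {
    print("Runs first; the caller is blocked until this returns")
}
print("Runs only after the sync block")

// async: the caller continues immediately
queue.async {
    print("Runs later, possibly after the next line")
}
print("Runs without waiting for the async block")
```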

Race Condition

A race condition occurs when two tasks execute concurrently that need to run sequentially in order to produce a correct result. For example, you can’t change a view’s constraints while they are being calculated; this is why UI work must be done on the main thread, where it executes sequentially.
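A classic illustration is two tasks incrementing the same counter with no synchronisation (a sketch; the result varies from run to run, which is exactly what thread-safety mechanisms prevent):

```swift
import Foundation

var counter = 0
let group = DispatchGroup()

// Two concurrent tasks mutate `counter` without any synchronisation
for _ in 0..<2 {
    DispatchQueue.global().async(group: group) {
        for _ in 0..<100_000 {
            counter += 1        // read-modify-write: updates can be lost
        }
    }
}

group.wait()
print(counter)   // often less than 200000 because of the race
```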

To achieve concurrency in iOS, there are two built-in APIs available:

  1. GCD (Grand Central Dispatch)
  2. OperationQueue (NSOperationQueue)