HJS-FUTURE
Future classes for the Hubrisjs JavaScript framework.
Provided classes include: Executors, Callable, Future, BlockingQueue, LinkedBlockingQueue, FutureTask, QueueingFuture, CompletionService, ExecutorCompletionService, AbstractExecutorService, AsyncTask and QueuedWork.
Npm lib installation
Node:

```sh
npm install hjs-future --save
```

Babel installation
Node:

```sh
npm install --save-dev babel-cli babel-plugin-transform-runtime babel-polyfill babel-preset-env babel-runtime
```

Webpack installation for web usage
Node:

```sh
npm install --save-dev babel-loader webpack
```
Table of Contents
- Usage of executors
- Executor
- Execute an anonymous runnable
- Execute a runnable
- Execution with parameters
- Custom executors
- Serial executor from capacity
- Serial executor from queue
- Execute serial runnables
- Async serial runnables execution
- Active runnable on serial
- Blocking executor from capacity
- Blocking executor from queue
- Blocking runnables
- Blocking async runnables
- Active blocking runnable
- Parallel executor
- Parallel runnables
- Prefilled parallel runnables
- Parallel runnable promise
- Parallel promise result
- Active parallel runnable
- Callable executor factory
- Single executor factory
- Serial executor factory
- Blocking executor factory
- Parallel executor factory
- Task executor factory
- Front task executor factory
- Timed task executor factory
- Delayed task executor factory
- Usage of futures and callables
- Usage of blocking queue
- Usage of executor services
Usage of executors
Executors are objects that execute submitted tasks. The Executor interface provides a way of decoupling task submission from the mechanics of how each task will be run.
However, an Executor does not strictly require that execution be asynchronous. In the simplest case, an executor can run the submitted task immediately.
- SerialExecutors automatically execute submitted tasks
- BlockingExecutors execute submitted tasks on demand
- ParallelExecutors execute submitted tasks only when the queue is full
Executors is a collection of factory and utility methods for the Executor classes defined in this module.
ExecutorServices are Executor implementations that provide methods to manage termination and that can produce a Future for tracking the progress of one or more asynchronous tasks.
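As an illustration of the decoupling idea in plain JavaScript (a conceptual sketch, independent of the hjs-future API — `DirectExecutor` and `SimpleSerialExecutor` are hypothetical names, not library classes):

```javascript
// A minimal direct executor: runs each submitted task immediately,
// showing that an Executor need not be asynchronous.
class DirectExecutor {
  execute(task) {
    task();
  }
}

// A serial-executor sketch: submission only enqueues the task;
// execution is deferred until start() drains the queue in order.
class SimpleSerialExecutor {
  constructor() {
    this.queue = [];
  }
  execute(task) {
    this.queue.push(task);
  }
  start() {
    const results = [];
    while (this.queue.length > 0) {
      results.push(this.queue.shift()());
    }
    return results;
  }
}

const direct = new DirectExecutor();
direct.execute(() => console.log('ran immediately'));

const serial = new SimpleSerialExecutor();
serial.execute(() => 1);
serial.execute(() => 2);
console.log(serial.start()); // [1, 2]
```

The caller decides *what* to run; the executor decides *when and how* — the same submission code works with either executor.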
Executor
```js
import { Executor } from 'hjs-future';

// basic abstract executor (reconstructed; exact API may differ)
const E = new Executor();
```
Execute an anonymous runnable
```js
import { Executor } from 'hjs-future';

// (reconstructed; exact API may differ)
const R = {
  run() {
    console.log('run');
    return null;
  }
};
const E = new Executor();
E.execute(R);
```
Execute a runnable
```js
import { Runnable, Executor } from 'hjs-future';

// (reconstructed; exact API may differ)
const R = new Runnable({
  run() {
    console.log('run');
    return null;
  }
});
const E = new Executor();
E.execute(R);
```
Execution with parameters
```js
import { Runnable, Executor } from 'hjs-future';

// (reconstructed; exact API may differ)
const R = new Runnable({
  run(...params) {
    let opt = params[0];
    console.log('run');
    if (opt) return opt.data;
    return null;
  }
});
const E = new Executor();
let data = E.execute(R, { data: 'some data' });
console.log(data);
```
Implement a custom runnable

```js
import { Runnable, Executor } from 'hjs-future';

// naive promise runnable implementation (reconstructed; exact API may differ)
const R = new Runnable({
  run(...params) {
    let opt = params[0];
    return new Promise((resolve, reject) => {
      if (opt && opt.data === "data to compute") resolve(opt.data);
      else reject(new Error('nothing to compute'));
    });
  }
});
const E = new Executor();
// this returns a promise that is resolved
E.execute(R, { data: "data to compute" }).then(result => console.log(result));
```
Custom executors
```js
import { Runnable } from 'hjs-future';

// (reconstructed; exact API may differ)
const R = new Runnable({
  run(...params) {
    let result = params[0] + " world";
    return result;
  }
});
// naive promise executor implementation
const E = {
  execute(r, ...params) {
    return new Promise((resolve, reject) => {
      if (r) resolve(r.run(...params));
      else reject(new Error('no runnable'));
    });
  }
};
E.execute(R, "hello").then(result => console.log(result));
```
Serial executor from capacity
```js
import { SerialExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 100;
const S = new SerialExecutor(capacity /* max runnables in the queue (defaults to 10) */);
```
Serial executor from queue
```js
import { Queue, SerialExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const CAPACITY = 10;
const queue = new Queue(CAPACITY);
const S = new SerialExecutor(queue /* AbstractQueue implementation (defaults to Queue) */);
```
Execute serial runnables
```js
import { SerialExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const SE = new SerialExecutor(capacity);
for (let i = 0; i < capacity; i++) {
  // enqueue runnables (this implementation of the execute method returns nothing)
  SE.execute(() => console.log(i), i);
}
// start queue execution
SE.start();
```
Async serial runnables execution
```js
import { SerialExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const SE = new SerialExecutor(capacity);
// runnables creator
const createRunnable = () => {
  return {
    run(...params) {
      // for simplicity, put the computation on a promise
      return new Promise((resolve) => {
        let index = params[0];
        if (index < capacity) {
          if (index === 4 || index === 7) {
            // async code can block the queue
            setTimeout(() => resolve(index), 100);
          } else {
            // sync code never blocks the queue
            resolve(index);
          }
        }
      });
    }
  };
};
for (let i = 0; i < capacity; i++) {
  // enqueue runnables
  SE.execute(createRunnable(), i);
}
// start queue execution and get all results as a promise
SE.start().then(results => console.log(results));
```
Active runnable on serial
```js
import { SerialExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const promises = new Array(capacity);
const SE = new SerialExecutor(capacity);
for (let i = 0; i < capacity; i++) {
  let p = new Promise((resolve) => {
    let first = SE.size() === 0;
    // enqueue a runnable task
    SE.execute(() => resolve(i), i);
    // getting the active runnable
    let active = SE.getActiveRunnable();
    // edge case: start the queue
    if (first) SE.start();
  });
  promises[i] = p;
}
Promise.all(promises).then(results => console.log(results));
```
Blocking executor from capacity
```js
import { BlockingExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const S = new BlockingExecutor(capacity /* max runnables in the queue (defaults to 10) */);
```
Blocking executor from queue
```js
import { Queue, BlockingExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const CAPACITY = 10;
const queue = new Queue(CAPACITY);
const S = new BlockingExecutor(queue /* AbstractQueue implementation (defaults to Queue) */);
```
Blocking runnables
```js
import { BlockingExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const BE = new BlockingExecutor(capacity);
for (let i = 0; i < capacity; i++) {
  // enqueue runnables
  BE.execute((executor, ...params) => {
    // don't forget to execute the next task
    executor.executeNext();
    return params[0];
  }, i);
}
// start queue execution and get all results.
// Be careful: results are in reversed order here, because tasks are enqueued
// from sub tasks — the last task becomes first.
console.log(BE.start());
```
Blocking async runnables
```js
import { Queue, BlockingExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const queue = new Queue(capacity);
const BE = new BlockingExecutor(queue);
const createRunnable = () => {
  return {
    run(executor, ...params) {
      // don't forget to execute the next task
      executor.executeNext();
      return new Promise((resolve) => {
        let idx = params[0];
        if (idx <= capacity) {
          if (idx === 4 || idx === 7) setTimeout(() => resolve(idx), 100);
          else resolve(idx);
        }
      });
    }
  };
};
for (let i = 0; i < capacity; i++) {
  BE.execute(createRunnable(), i);
}
// start queue execution and get all results as a promise
BE.start().then(results => console.log(results));
```
Active blocking runnable
```js
import { BlockingExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const BE = new BlockingExecutor(capacity);
let actives = [];
// enqueue runnable tasks
for (let i = 0; i < capacity; i++) {
  BE.execute(() => i, i);
}
let active = null;
while (BE.hasNext() && (active = BE.getActiveRunnable())) {
  actives.push(active);
  // don't forget to execute the next task
  BE.executeNext();
}
```
Parallel executor
```js
import { LinkedList, ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const queue = new LinkedList();
const PE = new ParallelExecutor(
  capacity /* max runnables in the queue (defaults to 10) */,
  queue /* AbstractQueue implementation (defaults to LinkedList) */
);
```
Parallel runnables
```js
import { ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 5;
const PE = new ParallelExecutor(capacity);
for (let i = 0; i < capacity - 1; i++) {
  PE.execute(() => console.log(i), i);
}
// queue not full
let isFull = PE.isFull();
if (!isFull) {
  // when we add the last element the queue is executed
  PE.execute(() => console.log(capacity - 1), capacity - 1);
}
```
Prefilled parallel runnables
```js
import { Runnable, LinkedList, ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 5;
const tasks = new Array(capacity);
for (let i = 0; i < capacity; i++) {
  tasks[i] = new Runnable({
    run(...params) {
      console.log(params[0]);
      return params[0];
    }
  });
}
// fill the queue
const queue = new LinkedList(tasks);
// later in the code
const PE = new ParallelExecutor(capacity, queue);
// here the queue is full
PE.start();
```
Parallel runnable promise
```js
import { ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const lorem = "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua";
const capacity = lorem.length;
const PE = new ParallelExecutor(capacity);
// execute one runnable per character; filling the queue starts execution
// and the executor returns the results as a promise
lorem.split('').forEach((char, i) => {
  PE.execute((...params) => params[0], char);
});
```
Parallel promise result
```js
import { ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const PE = new ParallelExecutor(capacity);
const createRunnable = () => {
  return {
    run(...params) {
      return new Promise((resolve) => {
        let idx = params[0];
        if (idx <= capacity) {
          if (idx === 4 || idx === 7) setTimeout(() => resolve(idx), 100);
          else resolve(idx);
        }
      });
    }
  };
};
for (let i = 0; i < capacity; i++) {
  PE.execute(createRunnable(), i);
}
// start queue execution and get all results as a promise
PE.start().then(results => console.log(results));
```
Active parallel runnable
```js
import { ParallelExecutor } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 5;
const PE = new ParallelExecutor(capacity);
const actives = [];
const runnables = [];
for (let i = 0; i < capacity; i++) {
  runnables[i] = {
    index: i,
    resolve: null,
    reject: null,
    run() {
      // resolve with the runnable itself so the active task can be inspected
      if (this.resolve) this.resolve(this);
    }
  };
}
const promises = runnables.map(r => new Promise((resolve, reject) => {
  r.resolve = resolve;
  r.reject = reject;
  PE.execute(r);
  // track the currently active runnable
  actives.push(PE.getActiveRunnable());
}));
Promise.all(promises).then(results => console.log(results));
```
Callable executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact API may differ)
// a runnable
let runnable = () => { console.log('run'); };
// an optional result
let result = "A default task";
// create a callable instance
let callable = Executors.callable(runnable, result);
```
Single executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
let PE = Executors.newSingleExecutor();
```
Serial executor factory
```js
import { Queue, Executors } from 'hjs-future';

// (reconstructed; exact factory names may differ)
// from a capacity
let SE1 = Executors.newSerialExecutor(10);
// from a queue
let SE2 = Executors.newSerialExecutor(new Queue(10));
```
Blocking executor factory
```js
import { Queue, Executors } from 'hjs-future';

// (reconstructed; exact factory names may differ)
// from a capacity
let BE1 = Executors.newBlockingExecutor(10);
// from a queue
let BE2 = Executors.newBlockingExecutor(new Queue(10));
```
Parallel executor factory
```js
import { LinkedList, Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
let PE = Executors.newParallelExecutor(10, new LinkedList());
```
Task executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
const capacity = 10;
// create an executor that posts tasks on a message handler
const PE = Executors.newTaskExecutor();
// execute runnables on the same executor
for (let i = 0; i < capacity; i++) {
  PE.execute(() => console.log(i), i);
}
```
Front task executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
const capacity = 10;
// create an executor that always posts at the front of the queue
const PE = Executors.newFrontTaskExecutor();
for (let i = 0; i < capacity; i++) {
  PE.execute(() => console.log(i), i);
}
```
Timed task executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
const capacity = 10;
// post-at-time executor
const PE = Executors.newTimedTaskExecutor();
for (let i = 0; i < capacity; i++) {
  let START_TIME = Date.now();
  let WAIT_TIME = 200 * (i + 1);
  let UPTIME_MILLIS = START_TIME + WAIT_TIME;
  PE.execute(() => console.log(i), UPTIME_MILLIS);
}
```
Delayed task executor factory
```js
import { Executors } from 'hjs-future';

// (reconstructed; exact factory name may differ)
const capacity = 10;
// post-with-delay executor
const PE = Executors.newDelayedTaskExecutor();
for (let i = 0; i < capacity; i++) {
  let start = Date.now();
  let delay = 200 * (i + 1);
  PE.execute(() => console.log(Date.now() - start), delay);
}
```
Usage of futures and callables
Callables are tasks that return a result and may throw an exception. Implementors define a single method called compute.
Futures represent the result of an asynchronous computation. Methods are provided to check if the computation is complete, to wait for its completion, and to retrieve its result. The result can only be retrieved using the get method when the computation has completed, blocking if necessary until it is ready. Cancellation is performed by the cancel method. Additional methods are provided to determine whether the task completed normally or was cancelled. Once a computation has completed, it cannot be cancelled. The framework gives you a future implementation named FutureTask.
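The lifecycle described above can be sketched in plain JavaScript (a conceptual model, not the hjs-future API — `SimpleFuture` is a hypothetical name):

```javascript
// Minimal future sketch: wraps a computation, tracks completion,
// and refuses cancellation once the computation has completed.
class SimpleFuture {
  constructor(compute) {
    this.compute = compute;
    this.done = false;
    this.cancelled = false;
    this.result = undefined;
  }
  run() {
    // a cancelled or finished future never computes again
    if (this.cancelled || this.done) return;
    this.result = this.compute();
    this.done = true;
  }
  cancel() {
    // once the computation has completed, it cannot be cancelled
    if (this.done) return false;
    this.cancelled = true;
    return true;
  }
  isDone() { return this.done; }
  isCancelled() { return this.cancelled; }
  get() {
    if (!this.done) throw new Error('computation not complete');
    return this.result;
  }
}

const f = new SimpleFuture(() => 6 * 7);
f.run();
console.log(f.isDone(), f.get()); // true 42
console.log(f.cancel()); // false: already completed
```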
Callable
```js
import { Callable, TimeUnit } from 'hjs-future';

// basic example that manually starts a callable (reconstructed; exact API may differ)
const C = new Callable(
  /* compute is the only accepted argument */
  function compute() {
    let P = new Promise((resolve) => {
      // sleep 2 seconds
      TimeUnit.SECONDS.sleep(2).then(() => resolve('done'));
    });
    return P;
  }
);
// listen for the computation
C.get().then(result => console.log(result));
// start the computation
C.compute();
```
Future from callable
```js
import { FutureTask, Callable } from 'hjs-future';

// (reconstructed; exact API may differ)
/* create a future task with a Callable instance */
const F1 = new FutureTask(
  { /* an associated callable instance */
    callable: new Callable(function compute() { /* ... */ }) },
  /* an optional callback */
  (result) => { /* ... */ }
);
/* create a future task with an anonymous callable */
const F2 = new FutureTask({
  /* an associated anonymous callable */
  callable: { compute() { /* ... */ } }
});
/* create a future task with an anonymous computation */
const F3 = new FutureTask(
  /* an associated anonymous computation */
  () => { /* ... */ }
);
```
Future from runnable
```js
import { FutureTask, Runnable } from 'hjs-future';

// (reconstructed; exact API may differ)
/* create a future task with a Runnable instance */
const F1 = new FutureTask(
  { /* an optional default result */
    result: "my default data",
    /* an associated runnable instance */
    runnable: new Runnable({ run() { /* ... */ } }) },
  /* an optional callback */
  (result) => { /* ... */ }
);
/* create a future task with an anonymous runnable */
const F2 = new FutureTask({
  /* an associated anonymous runnable */
  runnable: { run() { /* ... */ } }
});
/* create a future task with an anonymous computation */
const F3 = new FutureTask(
  /* an associated anonymous computation */
  () => { /* ... */ }
);
```
Future callable task
```js
import { FutureTask, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
/* create a future task */
const F = new FutureTask({
  /* an associated callable */
  callable: {
    compute() {
      return TimeUnit.MILLISECONDS.sleep(500);
    }
  }
});
/* execute this task before a timeout of 1 second */
F.run(1, TimeUnit.SECONDS);
// somewhere in the code
F.get().then(result => console.log(result));
```
Future runnable task
```js
import { FutureTask, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const result = "TEST";
const F = new FutureTask({
  /* optional default result */
  result,
  /* an associated runnable */
  runnable: {
    run() {
      return TimeUnit.MILLISECONDS.sleep(500);
    }
  }
});
/* execute this task before a timeout of 1 second */
F.run(1, TimeUnit.SECONDS);
F.get().then(r => console.log(r));
```
Cancel a callable task
```js
import { FutureTask, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const F = new FutureTask({
  callable: {
    compute() {
      return TimeUnit.MILLISECONDS.sleep(500);
    }
  }
});
F.run();
// cancel before the computation completes
F.cancel(true);
```
Cancel a runnable task
```js
import { FutureTask, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const result = "TEST";
const F = new FutureTask({
  result,
  runnable: {
    run() {
      return TimeUnit.MILLISECONDS.sleep(500);
    }
  }
});
F.run();
// cancel before the computation completes
F.cancel(true);
```
Cancel a future task
```js
import { FutureTask, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
/* create a future task */
const F = new FutureTask({
  callable: {
    compute() {
      return TimeUnit.SECONDS.sleep(2);
    }
  }
});
F.run();
// somewhere in the code
F.cancel(true);
```
Usage of blocking queue
A BlockingQueue is an AbstractQueue that additionally supports operations that wait for the queue to become non-empty when retrieving an element, and that wait for space to become available in the queue when storing an element.
BlockingQueue methods come in four forms, with different ways of handling operations that cannot be satisfied immediately but may be satisfied at some point in the future.
LinkedBlockingQueue is a BlockingQueue implementation based on linked nodes. This queue orders elements FIFO (first-in, first-out): the head of the queue is the element that has been on the queue the longest, and the tail is the element that has been on the queue the shortest. New elements are inserted at the tail of the queue, and retrieval operations obtain elements at the head. Linked queues typically have higher throughput than array-based queues but less predictable performance in most concurrent applications.
The optional capacity-bound constructor argument serves as a way to prevent excessive queue expansion. The capacity, if unspecified, is equal to Number.MAX_VALUE. Linked nodes are dynamically created upon each insertion unless this would bring the queue above capacity.
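The FIFO ordering and the capacity bound can be illustrated with a plain-JavaScript sketch (conceptual only — `BoundedFifoQueue` is a hypothetical name, and this non-blocking version simply rejects inserts beyond capacity rather than waiting):

```javascript
// Bounded FIFO queue sketch: offer() fails when the queue is at capacity,
// poll() removes from the head (the element queued the longest).
class BoundedFifoQueue {
  constructor(capacity = Number.MAX_VALUE) {
    this.capacity = capacity;
    this.items = [];
  }
  offer(item) {
    if (this.items.length >= this.capacity) return false; // queue full
    this.items.push(item); // new elements are inserted at the tail
    return true;
  }
  poll() {
    // retrieval takes the head of the queue
    return this.items.length > 0 ? this.items.shift() : null;
  }
  remainingCapacity() {
    return this.capacity - this.items.length;
  }
}

const q = new BoundedFifoQueue(2);
console.log(q.offer('a'), q.offer('b'), q.offer('c')); // true true false
console.log(q.poll()); // 'a' — FIFO order
console.log(q.remainingCapacity()); // 1
```

A real BlockingQueue replaces the `false`/`null` fast-fail paths with the waiting and timed variants described above.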
Linked blocking queue
```js
import { LinkedBlockingQueue } from 'hjs-future';

// bound to 5 elements
const capacity = 5;
// create the queue
let LBQ = new LinkedBlockingQueue(capacity);
```
Add
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 1;
let LBQ = new LinkedBlockingQueue(capacity);
// add an element
let added = LBQ.add({ data: "Added0" });
// true if the element was added
console.log(added);
```
AddAll
```js
import { LinkedBlockingQueue } from 'hjs-future';

const collection = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const capacity = collection.length;
let LBQ = new LinkedBlockingQueue(capacity);
// add all elements
let added = LBQ.addAll(collection);
// true if the queue was modified
console.log(added);
```
Clear
```js
import { LinkedBlockingQueue } from 'hjs-future';

const collection = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const capacity = collection.length;
let LBQ = new LinkedBlockingQueue(capacity);
if (LBQ.addAll(collection)) {
  // clear the queue
  LBQ.clear();
}
```
Contains
```js
import { LinkedBlockingQueue } from 'hjs-future';

const collection = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const capacity = collection.length;
let LBQ = new LinkedBlockingQueue(capacity);
// true if the queue was modified and contains the first element
console.log(LBQ.addAll(collection) && LBQ.contains(collection[0]));
```
Drain
```js
import { LinkedBlockingQueue } from 'hjs-future';

// (reconstructed; exact API may differ)
const collection = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const capacity = collection.length;
let LBQ = new LinkedBlockingQueue(capacity);
LBQ.addAll(collection);
// drain all elements into a destination collection
let out = [];
LBQ.drainTo(out);
console.log(out.length);
```
Element
```js
import { LinkedBlockingQueue } from 'hjs-future';

const collection = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const capacity = collection.length;
let LBQ = new LinkedBlockingQueue(capacity);
LBQ.addAll(collection);
// retrieve, without removing, the head of the queue
console.log(LBQ.element());
```
Offer
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
let LBQ = new LinkedBlockingQueue(capacity);
console.log(LBQ.remainingCapacity());
for (let i = 0; i < capacity; i++) {
  // offer an element
  let bool = LBQ.offer({ data: "Added" + i });
  // always true/false
  console.log(bool);
}
// here the queue is full
console.log(LBQ.size());
```
Timeout Offer
```js
import { LinkedBlockingQueue, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 4;
let LBQ = new LinkedBlockingQueue(capacity);
// overflow 50%
const overflow = Math.floor(capacity / 2);
// buffer size
const bufferOverflow = capacity + overflow;
// timeout
let timeout = 1000;
console.log(LBQ.remainingCapacity());
for (let i = 0; i < bufferOverflow; i++) {
  // offer an element, waiting up to the timeout if necessary
  let bool = LBQ.offer({ data: "Added" + i }, timeout, TimeUnit.MILLISECONDS);
  // the result of the operation (true/false)
  console.log(bool);
}
// here the queue is full
console.log(LBQ.size());
```
Peek
```js
import { LinkedBlockingQueue } from 'hjs-future';

const data = [{ data: "Added0" }, { data: "Added1" }, { data: "Added2" }, { data: "Added3" }];
const LBQ = new LinkedBlockingQueue(data);
// the first element
console.log(LBQ.peek());
```
Poll
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
let LBQ = new LinkedBlockingQueue(capacity);
console.log(LBQ.size());
for (let i = 0; i < capacity; i++) {
  // poll an element
  let element = LBQ.poll();
  // null elements are awaiting async completion
  console.log(element);
}
// the queue is empty
console.log(LBQ.size());
```
Timeout poll
```js
import { LinkedBlockingQueue, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 4;
let LBQ = new LinkedBlockingQueue(capacity);
// overflow 50%
const overflow = Math.floor(capacity / 2);
// buffer size
const bufferOverflow = capacity + overflow;
// timeout
let timeout = 1000;
console.log(LBQ.size());
for (let i = 0; i < bufferOverflow; i++) {
  // poll an element, waiting up to the timeout if necessary
  let element = LBQ.poll(timeout, TimeUnit.MILLISECONDS);
  // null elements are awaiting async completion
  console.log(element);
}
// the queue is empty
console.log(LBQ.size());
```
Put
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
const LBQ = new LinkedBlockingQueue(capacity);
let item = { data: "My secret" };
LBQ.put(item);
// the first element
console.log(LBQ.peek());
```
Remaining capacity
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
const LBQ = new LinkedBlockingQueue(capacity);
let item = { data: "My secret" };
LBQ.put(item);
// the number of additional elements the queue can accept
console.log(LBQ.remainingCapacity());
```
Remove
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
const LBQ = new LinkedBlockingQueue(capacity);
let item1 = { data: "My secret 1" };
LBQ.put(item1);
let item2 = { data: "My secret 2" };
LBQ.put(item2);
if (LBQ.remove(item1)) {
  console.log('removed');
}
```
Size
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
const LBQ = new LinkedBlockingQueue(capacity);
let item = { data: "My secret" };
LBQ.put(item);
// the number of elements in the queue
console.log(LBQ.size());
```
Take
```js
import { LinkedBlockingQueue } from 'hjs-future';

const capacity = 4;
const LBQ = new LinkedBlockingQueue(capacity);
let item = { data: "My secret" };
LBQ.put(item);
// take the head element, waiting if necessary until one becomes available
console.log(LBQ.take());
```
Offer and Poll operation (2 way data binding)
```js
import { LinkedBlockingQueue } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 4;
const overflow = Math.floor(capacity / 2);
const bufferOverflow = capacity + overflow;
const threshold = Math.floor(bufferOverflow / 2);
const LBQ = new LinkedBlockingQueue(capacity);
console.log(LBQ.remainingCapacity());
for (let i = 0; i < bufferOverflow; i++) {
  // offer an element or wait until the next POLL event
  let full = LBQ.offer({ data: "Added" + i });
  if (i >= threshold) {
    // wakeup operation
    LBQ.poll();
  }
}
```
Offer and Poll with delay operation (2 way data binding)
```js
import { LinkedBlockingQueue, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 4;
const overflow = Math.floor(capacity / 2);
const bufferOverflow = capacity + overflow;
const threshold = Math.floor(bufferOverflow / 2);
const timeout = 1000;
const LBQ = new LinkedBlockingQueue(capacity);
console.log(LBQ.remainingCapacity());
for (let i = 0; i < bufferOverflow; i++) {
  // offer an element or wait until the next POLL event
  let full = LBQ.offer({ data: "Added" + i }, timeout, TimeUnit.MILLISECONDS);
  console.log(full);
  if (i >= threshold) {
    // wakeup before the poll operation times out
    LBQ.poll();
  }
}
```
Usage of executor services
AbstractExecutorService provides an abstract implementation of the ExecutorService execution methods. It implements the submit, invokeAny and invokeAll methods using a RunnableFuture returned by newTaskFor, which defaults to the FutureTask class provided in this module.
ExecutorCompletionService uses a supplied Executor to execute tasks. It arranges that submitted tasks are, upon completion, placed on a queue accessible using take. The class is lightweight enough to be suitable for transient use when processing groups of tasks.
PoolExecutorService is an ExecutorService that executes each submitted task using one of possibly several pooled futures.
Executor service pools address two different problems:
- They usually provide improved performance when executing large numbers of asynchronous tasks, due to reduced per-task invocation overhead.
- They provide a means of bounding and managing the resources, including futures, consumed when executing a collection of tasks.

Each PoolExecutorService also maintains some basic statistics, such as the number of completed tasks. To be useful across a wide range of contexts, this class provides many adjustable parameters and extensibility hooks.
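The completion-order behavior behind ExecutorCompletionService can be sketched with plain promises (conceptual only — `SimpleCompletionService` is a hypothetical name, not the library API):

```javascript
// Completion-service sketch: tasks are started on submission, and finished
// results land on an internal queue in completion order, not submission order.
class SimpleCompletionService {
  constructor() {
    this.completed = [];
  }
  submit(asyncTask) {
    // execute the task and record its result as soon as it finishes
    asyncTask().then(result => this.completed.push(result));
  }
  // take the next completed result, waiting for one if necessary
  async take() {
    while (this.completed.length === 0) {
      await new Promise(resolve => setTimeout(resolve, 1));
    }
    return this.completed.shift();
  }
}

// helper: an async task that resolves with `value` after `ms` milliseconds
const delayed = (value, ms) => () =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

const cs = new SimpleCompletionService();
cs.submit(delayed('slow', 30));
cs.submit(delayed('fast', 5));
cs.take().then(first => console.log(first)); // 'fast' completes first
```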
Create an abstract executor service
```js
import { AbstractExecutorService } from 'hjs-future';

// (reconstructed; exact API may differ)
const AES = new AbstractExecutorService();
```
Submit tasks on an abstract executor service
```js
import { AbstractExecutorService, LinkedBlockingQueue, Callable } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
const Q = new LinkedBlockingQueue(capacity);
const AES = new AbstractExecutorService();
for (let i = 0; i < capacity; i++) {
  // submit a callable and get back a future for later execution
  let future = AES.submit(new Callable(function compute() { return i; }));
  // add to the queue for later execution
  if (!Q.offer(future)) console.log('queue full');
}
let future = null;
// later in the code
while ((future = Q.poll())) {
  // execute the future task
  future.run();
}
```
Invoke all tasks on an abstract executor service
```js
import { AbstractExecutorService, TimeUnit } from 'hjs-future';

// (reconstructed; exact API may differ)
const capacity = 10;
// list of callables
const callables = new Array(capacity);
// create an abstract executor service implementation
const AES = new AbstractExecutorService();
for (let i = 0; i < capacity; i++) {
  // fill the list with callables
  callables[i] = {
    compute() {
      const sleep = Math.floor(Math.random() * 5) + 1;
      // some tasks can't terminate before the executor service timeout
      return TimeUnit.SECONDS.sleep(sleep);
    }
  };
}
// invoke all the tasks, waiting whether or not they all terminate before the timeout
AES.invokeAll(callables, 3, TimeUnit.SECONDS);
```
Contacts
Distributed under the MIT license. See LICENSE for more information.