Concurrency with Actors
Flowa uses the Actor Model for safe concurrent programming: communication happens through message passing rather than shared memory.
Actor Model Principles
- No Shared State: Actors don't share memory
- Message Passing: Communication via messages
- Isolation: Each actor has its own state
- Lock-Free: No mutexes or synchronization primitives
Actors in Flowa
While full actor syntax is still in development, Flowa's event loop and async capabilities enable concurrent patterns:
// Conceptual actor pattern
func create_worker(id) {
    let worker = {
        "id": id,
        "process": func(data) {
            log.info("Worker " + tostring(id) + " processing: " + data);
            // Process data
            return "result_" + tostring(id);
        }
    };
    return worker;
}

// Create multiple workers
let worker1 = create_worker(1);
let worker2 = create_worker(2);

// Process in parallel (conceptually)
worker1["process"]("task1");
worker2["process"]("task2");

Event Loop
Flowa's event loop enables non-blocking operations (see the sketch after this list):
- HTTP server requests run concurrently
- Scheduled tasks execute independently
- I/O operations are non-blocking
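As a concrete illustration, the sketch below pairs a scheduled task with an HTTP route on the same event loop, so neither blocks the other. It reuses only constructs shown elsewhere in this guide; status_server, tick, and tick_count are illustrative names, not part of any Flowa API.

// A scheduled task and an HTTP server sharing one event loop
let tick_count = 0;

func tick() {
    tick_count = tick_count + 1;
    log.debug("Background tick: " + tostring(tick_count));
}

// Runs every minute, independently of request handling
cron.schedule("*/1 * * * *", tick);

let status_server = http.createServer();
status_server.on("GET", "/status", func(req, res) {
    // Served whenever a request arrives, between ticks
    res.writeHead(200, {"Content-Type": "text/plain"});
    res.end("Ticks so far: " + tostring(tick_count));
});
status_server.listen(3001);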
Concurrent Patterns
Message Queue
let message_queue = [];

func enqueue_message(message) {
    message_queue = push(message_queue, message);
    log.debug("Queued: " + message);
}

func process_queue() {
    while (len(message_queue) > 0) {
        let message = message_queue[0];
        message_queue = shift(message_queue);
        // Process message
        log.info("Processing: " + message);
    }
}

// Schedule periodic processing
cron.schedule("*/1 * * * *", process_queue);

Worker Pool
func create_worker_pool(size) {
    let workers = [];
    let i = 0;
    while (i < size) {
        let worker = create_worker(i);
        workers = push(workers, worker);
        i = i + 1;
    }
    let next = 0;
    return {
        "workers": workers,
        "submit": func(task) {
            // Round-robin: each submission goes to the next worker in turn
            let worker_id = next % size;
            next = next + 1;
            return workers[worker_id]["process"](task);
        }
    };
}

let pool = create_worker_pool(4);
pool["submit"]("task1");
pool["submit"]("task2");

Producer-Consumer
let shared_queue = [];

func producer() {
    let item = "item_" + tostring(len(shared_queue));
    shared_queue = push(shared_queue, item);
    log.info("Produced: " + item);
}

func consumer() {
    if (len(shared_queue) > 0) {
        let item = shared_queue[0];
        shared_queue = shift(shared_queue);
        log.info("Consumed: " + item);
    }
}

// Schedule producer and consumer
cron.schedule("*/1 * * * *", producer);
cron.schedule("*/2 * * * *", consumer);

Concurrent HTTP Server
let server = http.createServer();

// Each request is handled independently
server.on("GET", "/", func(req, res) {
    // This runs concurrently for multiple requests
    log.info("Handling request");
    res.writeHead(200, {"Content-Type": "text/html"});
    res.end("<h1>Response</h1>");
});

server.listen(3000);
// Server handles multiple requests concurrently

Best Practices
- Avoid Shared State: Each actor/worker has its own data
- Use Message Passing: Communicate via func calls or queues (see the sketch after this list)
- Keep Workers Simple: Each worker does one thing well
- Handle Errors: Isolate failures to individual workers
- Use Queues: Buffer work between producers and consumers
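The first three practices combine naturally: give each worker a private inbox and let callers interact with it only by sending messages. The sketch below is one possible shape for that pattern, not an official API; create_queued_worker, send, and drain are hypothetical names, and it assumes closures can reassign the variables they capture, as in the worker pool above.

// Each worker owns a private inbox; callers only pass messages in
func create_queued_worker(id) {
    let inbox = [];
    return {
        "send": func(message) {
            // Message passing: callers never touch the inbox directly
            inbox = push(inbox, message);
        },
        "drain": func() {
            while (len(inbox) > 0) {
                let message = inbox[0];
                inbox = shift(inbox);
                log.info("Worker " + tostring(id) + " handling: " + message);
            }
        }
    };
}

let log_worker = create_queued_worker(1);
log_worker["send"]("hello");
log_worker["send"]("world");

// Drain the inbox periodically, independently of other workers
cron.schedule("*/1 * * * *", func() {
    log_worker["drain"]();
});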
Concurrency Safety
Flowa's design promotes safety:
- No shared mutable state
- func closures for isolation (sketched after this list)
- Event loop prevents race conditions
- Message passing over shared memory
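A small sketch of the closure point: state created inside a func is reachable only through the funcs that close over it, so no other code can mutate it directly. create_counter is a hypothetical name, and the sketch assumes closures may reassign captured variables, as in the examples above.

// The count lives inside the closure; the only way to change it
// is through the funcs returned here
func create_counter() {
    let count = 0;
    return {
        "increment": func() {
            count = count + 1;
            return count;
        },
        "current": func() {
            return count;
        }
    };
}

let requests_seen = create_counter();
requests_seen["increment"]();
requests_seen["increment"]();
log.info("Requests handled: " + tostring(requests_seen["current"]()));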
Next Steps
- Performance - Optimize your code
- Architecture - Understand the runtime