Concurrency with Actors
Flowa uses the Actor Model for safe concurrent programming with message passing instead of shared memory.
Actor Model Principles
- No Shared State: Actors don't share memory
- Message Passing: Actors communicate only by sending messages
- Isolation: Each actor has its own state
- Lock-Free: No mutexes or synchronization primitives
Actors in Flowa
Actors are Flowa's core concurrency primitive: each actor owns its own state and interacts with the rest of the program only by exchanging messages.
Defining Actors
Actors are defined with the actor keyword. They resemble classes, but their methods execute asynchronously in the actor's own task context.
actor Worker {
    func init() {
        this.value = 0;
    }
    func process(data) {
        print("Processing:", data);
    }
    func getValue() {
        return this.value;
    }
}
Message Passing
Use spawn to create an actor instance. Method calls on an actor instance are non-blocking and return immediately:
let w = spawn Worker();
w.process("Task 1"); // Asynchronous call
w.process("Task 2"); // Non-blockingActor Features:
- Actors are stateful, concurrent entities
- Methods execute asynchronously in the actor's own context
- No shared memory between actors
- Communication via message passing
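To make these points concrete, here is a small illustrative Counter actor (a sketch; the name and methods are not part of any standard library). Each spawned instance keeps its own count, and calls to it are delivered as messages:

actor Counter {
    func init() {
        this.count = 0;
    }
    func increment() {
        this.count = this.count + 1;
    }
    func report(label) {
        print(label, this.count);
    }
}

let a = spawn Counter();
let b = spawn Counter();
a.increment();
a.increment();
b.increment();
a.report("a:"); // expected to print 2, assuming per-actor messages are handled in order
b.report("b:"); // expected to print 1; a and b never share state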
Event Loop
Flowa's event loop enables non-blocking operations:
- HTTP server requests run concurrently
- Scheduled tasks execute independently
- I/O operations are non-blocking
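As a rough sketch of how these pieces coexist (reusing the cron and http APIs from the patterns below, assuming cron.schedule also accepts an inline func, and with a hypothetical /ping route), a scheduled task and an HTTP server can be registered together and the event loop interleaves their work:

// A periodic task and an HTTP server share the same event loop
cron.schedule("*/5 * * * *", func() {
    log.info("Periodic maintenance task");
});

let server = http.createServer();
server.on("GET", "/ping", func(req, res) {
    res.writeHead(200, {"Content-Type": "text/plain"});
    res.end("pong");
});
server.listen(3000);
// Neither the scheduled task nor request handling blocks the other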
Concurrent Patterns
Message Queue
let message_queue = [];

func enqueue_message(message) {
    message_queue = push(message_queue, message);
    log.debug("Queued: " + message);
}

func process_queue() {
    while (len(message_queue) > 0) {
        let message = message_queue[0];
        message_queue = shift(message_queue);
        // Process message
        log.info("Processing: " + message);
    }
}

// Schedule periodic processing
cron.schedule("*/1 * * * *", process_queue);
Worker Pool
func create_worker_pool(size) {
    let workers = [];
    let i = 0;
    while (i < size) {
        let worker = create_worker(i);
        workers = push(workers, worker);
        i = i + 1;
    }
    // Round-robin counter so tasks are spread across the workers
    let next = 0;
    return {
        "workers": workers,
        "submit": func(task) {
            let worker_id = next % size;
            next = next + 1;
            return workers[worker_id]["process"](task);
        }
    };
}

let pool = create_worker_pool(4);
pool["submit"]("task1");
pool["submit"]("task2");
Producer-Consumer
let shared_queue = [];

func producer() {
    let item = "item_" + tostring(len(shared_queue));
    shared_queue = push(shared_queue, item);
    log.info("Produced: " + item);
}

func consumer() {
    if (len(shared_queue) > 0) {
        let item = shared_queue[0];
        shared_queue = shift(shared_queue);
        log.info("Consumed: " + item);
    }
}

// Schedule producer and consumer
cron.schedule("*/1 * * * *", producer);
cron.schedule("*/2 * * * *", consumer);
Concurrent HTTP Server
let server = http.createServer();

// Each request is handled independently
server.on("GET", "/", func(req, res) {
    // This runs concurrently for multiple requests
    log.info("Handling request");
    res.writeHead(200, {"Content-Type": "text/html"});
    res.end("<h1>Response</h1>");
});

server.listen(3000);
// Server handles multiple requests concurrently
Best Practices
- Avoid Shared State: Each actor/worker has its own data
- Use Message Passing: Communicate via func calls or queues
- Keep Workers Simple: Each worker does one thing well
- Handle Errors: Isolate failures to individual workers
- Use Queues: Buffer work between producers and consumers
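For the error-handling point, one simple approach that uses only the constructs shown above is to validate each message and skip (but log) anything unexpected, so a single bad item does not stall the whole queue. This is a sketch built on the Message Queue pattern; the empty-string check is just a stand-in for real validation, and Flowa's exception facilities are out of scope here:

func process_queue_safely() {
    while (len(message_queue) > 0) {
        let message = message_queue[0];
        message_queue = shift(message_queue);
        if (message == "") {
            // Skip the bad message instead of failing the whole run
            log.info("Skipping invalid message");
        } else {
            log.info("Processing: " + message);
        }
    }
}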
Concurrency Safety
Flowa's design promotes safety:
- No shared mutable state
- func closures for isolation
- Event loop scheduling avoids low-level data races
- Message passing over shared memory
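The closure point can be seen in a small factory func: each call produces its own captured state, so two counters never interfere. This is a sketch, assuming closures may reassign variables captured from an enclosing func:

func make_counter() {
    let count = 0;
    return func() {
        count = count + 1;
        return count;
    };
}

let c1 = make_counter();
let c2 = make_counter();
print(c1()); // 1
print(c1()); // 2
print(c2()); // 1 -- c2 has its own count, isolated from c1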
Next Steps
- Performance - Optimize your code
- Architecture - Understand the runtime