The library tries to provide pure JavaScript tools to create reactive interfaces using …
# Event-driven programming (3 parts separation ≡ 3PS)
Let's introduce the basic principle on which the library is built. We'll use the JavaScript listener as a starting point.
```js
const onchange= event=>
	console.log("Reacting to the:", event); // A
input.addEventListener("change", onchange); // B
input.dispatchEvent(new Event("change")); // C
```
As we can see, in the code at location “A” we define how to react when the function is called with an event as its argument. At that moment, we don't care who, why, or how the function was called. Similarly, at point “B”, we reference a function to be called on the event without caring what that function will do at that time. Finally, at point “C”, we tell the application that a change has occurred in the input, and we don't care if or how anyone is listening for the event.
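To make the three parts tangible outside a full page, the same pattern can be sketched with a bare `EventTarget` (available in browsers and modern Node.js); the `log` array here is only for illustration and is not part of the pattern:

```javascript
// The same 3PS separation without any DOM element:
const input= new EventTarget();
const log= [];

const onchange= event=> log.push("Reacting to the: "+event.type); // A: the reaction
input.addEventListener("change", onchange);                       // B: the registration
input.dispatchEvent(new Event("change"));                         // C: the notification

console.log(log); // [ "Reacting to the: change" ]
```

Each of the three statements can live in a different part of the application; none of them needs to know about the other two.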
We start with creating and modifying static elements and end up with UI templates (from `document.createElement` to `el`). Then we go through the native event system and the way to include it declaratively in UI templates (from `element.addEventListener` to `on`).
The next step is providing interactivity, and not only for our UI templates. We introduce signals (`S`) and how to incorporate them into UI templates.
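To build intuition for what a signal is, here is a minimal conceptual sketch, not the library's actual implementation: a signal is a function that reads its value when called with no arguments and writes it when called with one, and a signal created from a function recomputes whenever the signals it reads change:

```javascript
// Conceptual sketch only; the real `S` from "./esm-with-signals.js"
// is richer (subscriptions, cleanup, scopes, …).
let currentComputed= null; // the computed signal currently being evaluated

function S(initial){
	if(typeof initial === "function")
		return createComputed(initial);
	let value= initial;
	const listeners= new Set();
	return function signal(...args){
		if(!args.length){ // read: register the running computed as a dependent
			if(currentComputed) listeners.add(currentComputed);
			return value;
		}
		value= args[0]; // write: notify dependents
		listeners.forEach(listener=> listener());
		return value;
	};
}
function createComputed(fn){
	const out= S(undefined);
	function recompute(){
		currentComputed= recompute; // track reads made by fn()
		out(fn());
		currentComputed= null;
	}
	recompute();
	return out;
}

// Usage mirrors the example below:
const clicks= S(0);
const message= S(()=> "Hello World "+"🎉".repeat(clicks()));
console.log(message()); // "Hello World "
clicks(clicks()+1);
console.log(message()); // "Hello World 🎉"
```

The key point is the calling convention: `clicks()` reads, `clicks(clicks()+1)` writes, and `S(()=> …)` derives a value that stays up to date automatically.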
Now we will clarify how signals are incorporated into our templates with regard to application performance. This is not the only reason the library uses `scope`s. We will look at how they work in components represented in JavaScript by functions.
```js
import { el, S } from "./esm-with-signals.js";

const clicks= S(0);
document.body.append(
	el().append(
		el("p", S(()=>
			"Hello World "+"🎉".repeat(clicks())
		)),
		el("button", {
			type: "button",
			onclick: ()=> clicks(clicks()+1),
			textContent: "Fire"
		})
	)
);
```