I am curious whether or not Node.js scales more efficiently than the .NET
Provider in terms of concurrent users... I'm just curious which handles
concurrent users more efficiently in terms of CPU%.
All mainstream web application architectures scale by adding more cores and
memory and distributing (load balancing) "requests" across pools of
application server instances, and occasionally across pools of virtual
machine instances.
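To make that concrete for Node.js: a single Node.js process runs your
JavaScript on one thread, so "more cores" generally means running one
worker process per core and letting connections be spread across them.
A minimal sketch using the built-in cluster module (the port and the
trivial handler are just placeholders):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  // Primary process: fork one worker per core and replace any that die.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} exited, starting another`);
    cluster.fork();
  });
} else {
  // Each worker runs the HTTP server; the listening socket is shared,
  // so incoming requests are distributed across the worker processes.
  http
    .createServer((req, res) => {
      res.end(`handled by pid ${process.pid}\n`);
    })
    .listen(8080);
}

Scaling beyond one machine is the same pattern with a load balancer in
front of several of these pools.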
You're smart to ask about CPU efficiency, IMHO. And the other concern is
how much additional time system administrators will need to allocate to
"managing" distributed computing architectures.
Regarding CPU efficiency, you're in for a wake-up call if you add 5250
replacement to the comparison. Any of the web application interfaces we've
mentioned in this thread (Java, PHP, Python, Ruby, Node.js, MS .Net) will
increase CPU usage by at least 3,000% over comparable 5250 interfaces.
5250 interfaces that consume 3 milliseconds of CPU time will consume a
minimum of 90 milliseconds after being converted to use a browser user
interface. Web interfaces often consume 200-300 milliseconds of CPU time to
generate "an HTML page".
Comparing the CPU efficiency of Node.js vs. MS .Net is hard due to the
lack of benchmarks. Anecdotal evidence suggests that interpretive
scripting environments will consume several times more CPU than compiled
applications. But that could be ameliorated by reducing the amount of
"work" performed by Node.js or MS .Net by moving it to run in the IBM i
native environment.
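As a rough sketch of what moving work into the native environment can
look like from Node.js (the DSN, schema, and procedure names here are
made up, and I'm assuming the odbc npm package): rather than pulling
rows into JavaScript and crunching them there, make one call to a
compiled stored procedure, which could in turn front an RPG or CL
program, and let the heavy lifting run natively.

const odbc = require('odbc');

// Hypothetical example: MYLIB.SUMMARIZE_ORDERS is a made-up SQL
// procedure (it could wrap an ILE RPG or CL program). The point is a
// single round trip, with the real work running as compiled code on
// the IBM i side instead of in JavaScript.
async function summarizeOrders(customerId) {
  const conn = await odbc.connect('DSN=*LOCAL');
  try {
    return await conn.query('CALL MYLIB.SUMMARIZE_ORDERS(?)', [customerId]);
  } finally {
    await conn.close();
  }
}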
Mike Pavlak of Zend has a story about a customer who was using XMLSERVICE
to run the ADDLIBLE command from PHP and complained about the performance.
Rather than invoke the toolkit once for each library in the library list,
Mike suggested calling a CL program once to add all the needed libraries.
That fixed the performance problem.
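The same pattern applies from Node.js. As a sketch (not Mike's actual
code; I'm using ODBC and QSYS2.QCMDEXC rather than XMLSERVICE to show
the round trips, and SETLIBL is a made-up CL program): the slow version
pays one round trip per library, the fix pays one round trip and lets
compiled CL do the rest.

const odbc = require('odbc');

// Slow pattern: one round trip to the server for every library.
async function addLibrariesOneByOne(conn, libs) {
  for (const lib of libs) {
    await conn.query(`CALL QSYS2.QCMDEXC('ADDLIBLE LIB(${lib})')`);
  }
}

// The fix: one call to a CL program that issues all the ADDLIBLEs itself.
async function addLibrariesInOneCall(conn) {
  await conn.query(`CALL QSYS2.QCMDEXC('CALL PGM(MYLIB/SETLIBL)')`);
}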