I am very happy to hear that HCL is investing in Domino and improving the existing technology stack. But as a German, I have to be sceptical (it’s in our genes), because I cannot see any advantage in the integration of node.js applications on top of Domino. I wrote a demo two years ago, just to prove that it would be possible.
One of the main reasons is that I have switched my application architecture, which means that Domino is nothing more than a great NoSQL data container. Because the existing REST APIs absolutely did not fit my requirements (too slow, painful output, and not expandable), I pursued “my own way” by using Spring Boot as my preferred technology. This made me independent of IBM’s release cycles, and since the Java 8 upgrade I am happy, because I was able to add the missing parts which were never delivered by IBM.
Token authentication? Solved by creating my own solution. Performance? Boosted with Memcache. Memory limitations? Also solved with Memcache. Delayed agent execution? Solved with Spring Boot. I have dropped the Designer and use Eclipse directly; especially the development and maintenance of legacy Java agents is a lot more fun. Code analysis and quality? Maven, JUnit & SonarQube are your friends. SSL encryption? Nginx. And the list grows and grows…
My point is that being independent of IBM’s releases allows me to be extremely flexible – which IBM is not. Just have a look at Bootstrap and XPages: I have created my own renderers, and I can switch to the latest version with a few clicks (as long as there is no fundamental change in the structure). I do not have to wait for someone who will – maybe – someday port the latest version to the XPages Extension Library. If one of my customers wants to use it: OK, no problem.
That’s what my customers love: the sky (aka budget) is the limit.
And here comes the problem I see with the node.js integration: The release cycles are extremely short. Just have a look at the release list:
In the last 8 months there were 10(!) new versions for Carbon (V8, LTS), and 26 versions since 2016 for Boron (V6, LTS). And that’s only node.js – the core of the whole thing. Don’t forget the packages and their dependencies. Let’s skip the fundamental problems of the NPM ecosystem: if you need the latest updates, “npm update -g” and everything is fine.
But waiting for Big Blue for a hotfix? If the “Domino NPM package” is not updated but depends on an older version, you may not be able to update the whole application. Ever had problems with the old JAR files of the Domino JVM? Or had to downgrade the Eclipse Maven plugin to run with Domino’s JRE 6? Just think about it…
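To illustrate the pinning problem, here is a purely hypothetical manifest (the package name, versions and fields are my assumptions, not the real Domino NPM package): if the vendor package declares an old dependency range and an `engines` constraint, npm will keep resolving to the old line, and you cannot move the application to a newer node.js until the vendor ships an update.

```json
{
  "name": "hypothetical-domino-npm-package",
  "version": "1.0.0",
  "dependencies": {
    "some-transitive-driver": "1.1.x"
  },
  "engines": {
    "node": "8.x"
  }
}
```

Running “npm update -g” does not help here: the resolver honours these ranges, so the whole application stays on whatever the slowest-moving package permits.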
In my eyes there is no reason for tying node.js to Domino. My advice is to build REST interfaces on top of Domino (and reuse the existing business logic), and access them from a separate application based on [enter your preferred technology here] with a backend connector. The frontend can be realised by a web development team or company. This takes a lot of pressure off the existing Domino environment (from the management perspective): you can build new applications with the current hipster technology, you can find developers and administrators, and the costs of modernization are not as high as those of a migration. After taking this path, some customers who abandoned Domino years ago are investing in the product again.
So far I am still open to a big surprise, and hopefully HCL can convince me of the contrary.
From what has been said at conferences, although Node.js will come as a sidecar to Domino, people won’t be tied to a particular version and can upgrade independently – which, to be honest, companies should. The ones who will get the most benefit going forward will be those who take a different approach to application development than the traditional one. You’ve certainly been a trailblazer for many of us.
And I agree completely about XPages. The extensibility of XPages and the Domino server has been a huge advantage and open sourcing of the Extension Library has enabled the community to do some great work both contributing to the project (though that has stalled a little) and building around it.
Don’t get me wrong: I have not followed the discussion before writing this post, nor do I have access to the Domino 10 beta, but as I mentioned in this post, I am willing to be impressed by how HCL will solve the integration.
This is not “bashing”, nor is it meant as a negative comment about the idea of integrating it. I had done an integration before, but then I saw that this does not solve the problem of finding new developers for Domino, because they have no idea about Domino and its capabilities or structures.
I have had a lot of discussions about “Reader fields”, “Categories”, “Rich Text” and all the other stuff that is daily business for us. But for students fresh from university it is sorcery and hellish stuff which they don’t understand.
But give them a REST API with clear endpoints, and they are happy and build great applications.
I think the integration of node.js into Domino will be a good thing. However, I too have my skepticism about IBM or HCL being able to fix the Domino modules in a timely manner when a problem arises – or about them staying in the game long enough to maintain the modules.
Currently, if we want a node.js solution for anything dealing with Domino, we write something that consumes DDS or some custom REST service and then passes that data along. It will be nice to be able to consume Domino data directly, without having to figure out a REST service for my REST service, which just introduces another layer of technology.
Great article – it really made me think about why I was excited about node.js coming to Domino.