Ajax vs. SOA

While surfing the web, I happened to find a blog entry by Dion Hinchcliffe, containing some interesting observations on the conflict between Ajax and SOA, related to my recent discussion with Dag König. The most interesting part is the enumeration at the end of the post, under the heading "SOA Implications".

Ha, Dion’s blog is the first one I’ve seen that looks like a OneNote notebook! That’s cool. Well, reasonably cool, at least.


New Official Release of RSS Bandit

Finally, the "Nightcrawler" version of RSS Bandit has been released. Looks good, so far! The only obvious problem I can see is that the "read" status isn’t communicated to Newsgator Online: the right feeds get synchronized, but it looks as if I haven’t read any items.

I hope they’ve fixed an annoying bug with the "newspaper view": if a refresh of all feeds occurred while I was reading a group of feeds in newspaper view, the view simply vanished, and I had to start reading it from scratch (because items might have been added). Doing "mark as read" when I started reading in newspaper view wasn’t a solution either, because I couldn’t know whether a refresh operation would hit while I was reading. It isn’t an easy problem: if the newspaper view stayed while the feeds were refreshed, "mark all as read" should only mark the items in the newspaper view, or I would be forced to mark every item as read individually. I haven’t checked their list of fixed issues yet, but I’ll tell you later whether it works satisfactorily.

Update 29 Nov. 2005: They have indeed fixed the bug, not in the way I expected, but in an even better (less complicated) way. The newspaper view isn’t removed when new items arrive, so I can mark everything as read before I start reading. Then if new items arrive, I won’t see them in the view, but they will stay unread, so I’ll see them the next time I read from that category.
I just have to remember to press Ctrl-Q (mark all as read) before reading, and not after.

Comments on Ajax versus SOA and Smart Clients

For those of you who know Swedish, there’s some interesting discussion (post 1, post 2, with comments) at Dag König’s blog, related to my previous two posts (post 1, post 2) about Ajax versus Smart Clients. For those who don’t, here’s a summary from my perspective:

In his first post, Dag says that the internal email sent by Bill Gates recently fits very well into the SOA view of developing software.

Then I comment on this, stating that the Web 2.0 way of developing applications focuses rather on more "lightweight" technologies (SOA development is usually viewed as more "heavyweight"), so Microsoft "turning ship" could instead mean that they are turning away from their SOA focus (they won’t abandon it entirely, I’m sure). But I also say that Microsoft will probably win this battle anyway, thanks to their effort on Smart Clients: those will be needed to make use of the Web 2.0 services when we’re not online (which still happens, and will keep happening to many people, at least occasionally, during the coming few years).

Dag replies in a second post that when the Web 2.0 services are used for more important purposes, we’ll see that more of an SOA view (and its associated technologies) becomes necessary (several Web 2.0 services are already publishing external APIs), and that Microsoft, having invested a lot in SOA, will then get to show its advantages. In addition, it’s becoming possible to be online more and more of the time, so the importance of offline clients decreases. Nevertheless, there are technologies that bridge the two worlds: ClickOnce and Java applets.

My comment to that post is that I agree that when, for example, digital identity management is required for those services, we’ll probably see more heavyweight protocols being used, but that there’s no immediate pressure to use SOA at the moment. And those bridging technologies are actually the smart clients (where Microsoft will get its payback, as I commented in my previous post).

End of discussion, so far. Looking forward to more discussions of this kind! 🙂

Spam and Evolution

I often catch myself trying to find what’s positive even about the most negative things. Take spam and phishing, for example. Bad, bad. But thanks to those, we’re getting good things like systems for handling digital identity, so that we can safely keep track of our digital secrets, like our credit cards. That’s useful not only for stopping spam and phishing: it makes us trust the net more, so that people will use and build more services, and the net will grow.

And those hackers exploiting security holes in Windows, commanding armies of zombie machines to send spam all over the planet? Bad. But didn’t that finally make Microsoft focus on security? And that isn’t good only for stopping hackers: it will make the OS more stable and better functionally partitioned (I hope).

A parallel from evolution: if the climate had always been hot everywhere on the planet, the dinosaurs would never have gotten feathers. But it wasn’t, and feathers enabled them to evolve into birds, since feathers could be used for more than keeping warm.

Over and over again you see examples where an evolutionary pressure causes something to evolve that can later be used for something else. The same thing is happening on the net, only quicker. And yet people still complain that finding a solution to the spam problem is taking a long time!

So in the long run, spam is good. Just trust (and join!) the internet evolution.