Why kjs can not rule the web

Dear Planet,

I feel like writing a bit about what I did for kjs, covering only the new features (ignoring bug fixes for now).

Let's start with the things I already got into a released version:
– Array.isArray
– Object.getPrototypeOf
– Object.keys
– Object.getOwnPropertyNames
– String.trim, trimLeft, trimRight
Not that much, hm?

Now let's look at the stuff I am trying to get in for 4.9.2:
– JSON.parse
– JSON.stringify
still not much…
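For the curious: the two JSON functions simply round-trip between JSON text and JavaScript values. This is plain ES5 behavior, nothing kjs-specific:

```javascript
// Round-tripping a value through the two JSON functions.
var settings = { browser: "konqueror", tabs: 4 };

// JSON.stringify turns a value into its JSON text form...
var text = JSON.stringify(settings);   // '{"browser":"konqueror","tabs":4}'

// ...and JSON.parse turns JSON text back into a fresh value.
var copy = JSON.parse(text);           // copy.tabs === 4
```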

And the rest, which is implemented but still needs cleanup, review, or something else before it can get in:
– Object.getOwnPropertyDescriptor
– Object.defineProperty
– Object.defineProperties (blocked by Object.defineProperty)
– Object.isExtensible (mainly blocked by Object.defineProperty)
– Object.preventExtensions (mainly blocked by Object.defineProperty)
– Object.seal (mainly blocked by Object.defineProperty)
– Object.isSealed (mainly blocked by Object.defineProperty)
– Object.freeze (mainly blocked by Object.defineProperty)
– Object.isFrozen (mainly blocked by Object.defineProperty)
– Object.create (blocked by Object.defineProperty)
– Date.toISOString
– Date.toJSON
OK, that's a bit more.

Together with the already present functions, that would nearly complete ECMAScript Edition 5. Doesn't sound bad, huh?
But still, with all those new functions (and some bug fixes), kjs has no chance to rule the web world.
Why? It's not because it is too slow, or because of other major bugs that keep websites from working. The reason is much simpler…
Let me show you the problem with some (I think) simple JS code, taken from battle.net (reduced).

var version;
if (browser == 'ff')
    version = /firefox\/([-.0-9]+)/.exec(userAgent);
else if (browser == 'ie')
    version = /msie ([-.0-9]+)/.exec(userAgent);
else if (browser == 'chrome')
    version = /chrome\/([-.0-9]+)/.exec(userAgent);
else if (browser == 'opera')
    version = /opera\/([-.0-9]+)/.exec(userAgent);
else if (browser == 'safari')
    version = /safari\/([-.0-9]+)/.exec(userAgent);

UserAgent.version = version[1].substring(0, 1);

“version” is still undefined for khtml/kjs, because no branch matched. Trying to access undefined[1] then throws a TypeError, and since the exception is not caught, that's it for kjs: all the following JavaScript code will never be executed.

So the main reason is that kjs/khtml fails at the browser detection…
This could be solved if they added a sane “else”. Talking about a sane “else”: there really are many websites that have an else in this case. But many of them have an insane else, which means hitting non-standard JavaScript code, code that only one browser ever supported.
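To make the “sane else” concrete, here is a hypothetical defensive rewrite of the battle.net snippet; the lookup table and the "0" fallback are my own invention, not their actual code:

```javascript
// Hypothetical defensive rewrite: every browser, known or not,
// ends up with a usable version string.
function detectVersion(browser, userAgent) {
  var patterns = {
    ff:     /firefox\/([-.0-9]+)/,
    ie:     /msie ([-.0-9]+)/,
    chrome: /chrome\/([-.0-9]+)/,
    opera:  /opera\/([-.0-9]+)/,
    safari: /safari\/([-.0-9]+)/
  };
  var re = patterns[browser];
  var match = re ? re.exec(userAgent) : null;
  // The sane "else": an unknown browser gets a harmless default
  // instead of leaving version undefined and crashing later.
  return match ? match[1] : "0";
}
```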

Maybe you are thinking “oh come on… that only happens on some exotic websites”, and I wish that were the case. But it's not. I looked at SOME websites to see why they don't work with khtml/kjs, from the JavaScript point of view, and for the last 30 of them this was the cause, and that includes some pretty big websites.

NOTE: Not for all websites is it so easy to find the core problem; sometimes it looks very different at first.

So that's it: no matter how fast or how bug-free kjs/khtml becomes, it will never work with those websites.

P.S.: I know you can change the browser detection in konqueror, but that is not a solution for the users.

12 Responses to “Why kjs can not rule the web”

  1. lnfs Says:

    I’d say pretending you’re something else is perfectly fine – as long as you provide all its features.

    By the way, detecting single browsers is plain stupid; all the cool guys know and teach that FEATURE detection is the way to go.
    Personally I’d tell the website owners to stop
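    Feature detection in that spirit looks something like this (an illustrative sketch of mine, probing for the JSON capability instead of trusting the user agent string):

```javascript
// Feature detection: ask whether the capability exists, instead of
// guessing from the user agent string. Illustrative example only.
function supportsJSON(global) {
  return typeof global.JSON === "object" &&
         typeof global.JSON.parse === "function";
}

// Use the feature if present, fall back gracefully otherwise.
function safeParse(global, text) {
  if (supportsJSON(global)) {
    return global.JSON.parse(text);
  }
  return null; // or hand off to a slower fallback parser
}
```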

    The main problem of kjs is marketing: if lots of people were to start using it, web developers would adapt (exactly as they did with chrome).

    BTW, congratulations on the work you do, it’s impressive and always interesting!

  2. Mirosław Zalewski Says:

    I must have missed a point here – what is your proposal? What do you think should be done?

    It is understandable that you are feeling frustrated. Opera users feel the same, since they have one of the most standards-compliant browsers out there, yet they see websites stripped of some features (Google, I am looking at you). If they simply spoof the UserAgent, all these features magically appear…

    In Opera, they have additional, non-standard JS events (before and after script execution) and one huge userscript file that fixes various bugs on websites. Maybe that could be done for KJS/Konqueror too?

  3. m_goku Says:

    Thank you for what you have done for KHTML/KJS.
    I don’t quite understand what you mean by “version” is still undefined for khtml/kjs.

  4. Marc Says:

    Why don’t you just set the identifier to be “firefox” or “chrome” or anything that works?

    Do you really think that there’s *any* website, at all, that does something special when the identifier is “konqueror”?

    I’m trying to be pragmatic here ;)

  5. Patrick Says:

    So a really clever solution would be to detect such cases: whenever a userAgent is queried, have a look whether konqueror (or anything else using kjs) is an option (supporting “switch” and “if” should cover most cases). If not, just pretend to be something else.

    This is a bit hacky and somewhat goes against the basic idea in the first place, but website developers seem to have the lowest programming skills on average (see the aversion against XHTML 2: http://en.wikipedia.org/wiki/XHTML#XHTML_2.0), so try to cope with that.

  6. lamefun Says:


    Such a bug.

    ???!!! O_o

  7. olin Says:

    The solution is to have a configurable list of false identifications for sites that apparently do browser detection and have a wrong fall-back. The list can be pre-configured for all major websites (like news) that need it.

  8. buscher Says:

    @lnfs, @Marc: The problem with pretending to be something else is that, at some point, you also have to implement their bugs, as detection is used to work around bugs or “special behavior”. Which I don’t want. Or rather, it may be OK for JavaScript, but for HTML rendering it could be a nightmare.
    Anyway, for JavaScript we don’t really need a special workaround, as kjs is pretty close to the ECMAScript standard. Pretty close as in: nothing I am aware of is different.

    @Patrick: Puh… that kind of sounds like a sane option, but it may not be so easy in the current kjs, and it may also cause more performance problems. And as said, some detections are very hard to spot. Still a good option :)

    @olin: Yes, but the web is sooooooo big that it would be impossible to keep such a list up to date; after some weeks, months, or years it would just cause more problems, I think.

  9. Marc Says:

    I’d rather have a website with some wrong things than no website at all (“your browser is not supported”).


    Just my two cents though.

    By the way, thanks for all your hard work! It’s fantastic!

  10. Martin Gräßlin Says:

    @olin: At KWin we were not able to keep a blacklist for three vendors with maybe 20 devices each up to date, and we completely dropped the idea of blacklists. If it is not possible to make that work for such a small domain, how should it ever work for a domain like the Web, where major sites might ship different JavaScript in the morning and in the evening?

  11. Stefan Says:

    @Martin: It might still work for the KJS case. The perceived entry level for website feature testing is considerably lower than for graphics driver feature testing, so a crowd-sourced approach might function.

    A larger problem in this case is the much smaller userbase of KJS, compared to KWin.

  12. moltonel Says:

    How about setting up a crowd-sourced “This website needs UA faking” database?
    * Let Konqueror default to its own UA.
    * If a user manually changes the UA on a website, prompt them: “Does this website work better now? Do you want to share your setting?”
    * Hostname->UA_override counts are stored on some public server.
    * If a JavaScript error occurs on a page, automatically check the online list to see whether many users fixed it with a different UA, and suggest that the user do the same.
    * Provide a “hall of shame” webpage listing hostnames that need UA faking.
    * The data on the server should be fully anonymous and purged after a while. Duplicate submissions can be avoided client-side without the need for a “machine id”.
    * This service could actually be browser-agnostic.
