Incorrect letters in WebGL #4

Open · atheros opened this issue Oct 2, 2015 · 9 comments

atheros commented Oct 2, 2015

There is a bug in font handling or something. In WebGL mode some letters have incorrect glyphs: http://tapiov.net/unicodetiles.js/examples/01-minimal.html

This shows "Hekko voqkd!" instead of "Hello world!" in both Firefox and Chrome.
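
For reference, the linked example boils down to just a few lines. A rough sketch of it (the ut.Viewport arguments, the putString signature, and the render call are assumed from the library's examples, not copied from the page):

```js
// Sketch of the 01-minimal example (argument values assumed):
var term = new ut.Viewport(document.getElementById("game"), 20, 5);
term.putString("Hello world!", 0, 0);
term.render(); // with the WebGL backend, some glyphs come out wrong
```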

tapio commented Oct 3, 2015

Looks fine here with Chrome 44b and Firefox 41, as well as Opera 12.16. What are your exact browser versions and operating system?

atheros commented Oct 3, 2015

I opened http://tapiov.net/unicodetiles.js/examples/01-minimal.html in:

  • Chrome - version 44.0.2403.130 (64-bit)
  • Firefox - version 41.0.1
  • Opera - version 32.0

All on Debian Jessie (64-bit).

Here is the result: http://dump.bitcheese.net/images/uxowadu/problem.png

tapio commented Oct 3, 2015

Thanks, clearly it is not a browser bug since every UA agrees. What is your locale? (Shooting in the dark here, but it's strange that all your browsers agree on the garbage...)

atheros commented Oct 3, 2015

Same thing on en_US.ISO-8859-1 and en_US.UTF-8. I of course verified that those locales are generated.

atheros commented Oct 4, 2015

I removed the call to cacheChars from the WebGLRenderer constructor and called putString("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789,+").

This is the result: http://dump.bitcheese.net/images/ikitone/problem2.png

The problem starts when I add the + to the end of the string (or any other new character).

On the right side I made the offscreen canvas visible, and the generated texture seems correct (except maybe for the empty character at the start).
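
A sketch of that experiment (cacheChars and the test string are from this comment; the coordinates and surrounding setup are assumed):

```js
// With the constructor's cacheChars(...) call removed, the atlas is
// populated lazily, one new glyph at a time, by this single string:
term.putString(
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789,+",
  0, 0 // coordinates assumed for illustration
);
// 26 + 26 + 10 + 1 = 63 glyphs precede the '+'; counting the empty
// glyph at the start of the atlas, '+' becomes the 65th entry -- the
// first one added after the atlas grows past 64 cells.
```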

atheros commented Oct 5, 2015

I've tested on another computer and it works. Since it was a Thinkpad W510 vs. an X230, I suspect the most likely difference is the nVidia graphics card in the W510. Unfortunately I cannot currently confirm whether Intel graphics work on Windows.

My guess is that either a software GL implementation or the Intel HW-accelerated driver on the X230 under Linux makes the difference.

I suspect software rendering as the distinguishing factor because it worked correctly when launched from a Windows 7 virtual machine without HW acceleration.

Maybe there is some subtle difference somewhere in the shader (as the JavaScript code cannot differ between nVidia and non-nVidia environments).

tapio commented Oct 5, 2015

I tried to investigate a little yesterday, but was still unable to reproduce. I noticed the + sign you said broke it is the 65th character in the atlas, which is kind of curious since one less is a power of two and works (might be a coincidence though). Does it always break exactly after 64 chars in the atlas?
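
Why atlas size could interact with shader precision: a glyph atlas shader maps a character index to a texture coordinate inside that glyph's cell, and the cells get narrower as the atlas grows. A rough JavaScript illustration of the general arithmetic (not the library's actual code):

```js
// u coordinate of the center of atlas cell i, for an atlas that is
// atlasCols glyphs wide. Any error larger than half a cell width
// (0.5 / atlasCols) samples a neighbouring glyph instead.
function cellCenterU(i, atlasCols) {
  return (i + 0.5) / atlasCols;
}

console.log(cellCenterU(64, 65)); // ~0.9923, the '+' cell from above
console.log(0.5 / 65);            // ~0.0077, the error budget per glyph
// GLSL ES only guarantees about 2^-8 (~0.0039) absolute precision for
// lowp floats near 1.0, which is uncomfortably close to that budget.
```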

Two other random things to try (sketched after this list):

  • Change the fragment shader precision to "highp" by swapping the line here to "precision highp float;".
  • Force the atlas texture to be big from the start by changing this line to something like w = 80;.
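
Both tweaks in sketch form (the exact locations are the lines linked above; the surrounding code is assumed):

```js
// Tweak 1 -- in the fragment shader source, force full float
// precision instead of the lower default on the linked line:
"precision highp float;"

// Tweak 2 -- in the WebGLRenderer setup (linked line), start the
// atlas wide enough that adding glyphs never forces it to grow:
w = 80; // previously computed at runtime (assumed)
```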

atheros commented Oct 6, 2015

Great, the second option fixes the issue. I wonder how many unique characters are needed now to make it break again :)

I'll try to run some tests later.

atheros commented Oct 6, 2015

Interestingly, setting w to 8 also works.
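
That both w = 80 and w = 8 work while the default breaks at the 65th glyph suggests the failure depends on when or how the atlas grows, not on its final size. For reference, a texture resize in raw WebGL generally has to both re-upload the pixels and refresh any uniform that encodes the texture's dimensions, or later lookups go stale. A generic sketch (all names here are illustrative, not from unicodetiles.js):

```js
// Generic atlas (re)upload helper: push the grown canvas into the
// texture and keep the size uniform in sync with it.
function uploadAtlas(gl, program, texture, atlasCanvas) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, atlasCanvas);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  gl.useProgram(program);
  // If the shader indexes glyph cells via a size uniform, it must be
  // refreshed on every resize or new glyphs sample the wrong cells.
  var sizeLoc = gl.getUniformLocation(program, "uTexSize"); // name assumed
  gl.uniform2f(sizeLoc, atlasCanvas.width, atlasCanvas.height);
}
```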
