Display invisible characters in vim
When I compile my source with LaTeX, I get the following error message:
Unicode char \u8: not set up for use with LaTeX.
Now I suspect that this is due to an invisible character. The command :set list
doesn't show anything suspicious, and :set display+=uhex
doesn't seem to work for me. This post suggests that something fishy is going on in my file as well.
Is there a way in Vim to show every character in my file that is not printable? I am using MacVim Version 7.3 (53).
Solution 1:
You could run:
:setlocal display=uhex
to display non-ASCII characters by their hex number. You might also try to highlight non-printable characters with:
:set hlsearch
/\(\p\|$\)\@!.
Then there is the 'isprint'
option, which controls which characters are considered printable.
If that does not help, you might want to run :%!xxd
and check byte by byte whether there is something fishy around the spot where you encounter the problem.
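The xxd check above can also be sketched from the shell. The file name and the stray backspace byte below are assumptions made up for this illustration, not from the original question:

```shell
# Create a small file containing a stray backspace byte (0x08) --
# 'sample.tex' is a hypothetical name used only for this sketch.
printf 'foo\bbar\n' > sample.tex

# Dump it as hex: the stray 08 shows up between the bytes of 'foo' and 'bar'.
xxd sample.tex
```

Inside Vim, :%!xxd filters the whole buffer through the same tool, and :%!xxd -r converts back after you have fixed the offending bytes.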
Solution 2:
Vim usually shows something for every character in the file, except spaces, tabs and end-of-line sequences or characters. I don't think this is a hidden character problem; I think this is a file encoding problem. I think Vim is using UTF-8 to encode some characters in your file and LaTeX is expecting ASCII or Latin1. (I use ASCII almost exclusively, so I'm certainly not an expert in other encodings.)
To check the encoding that Vim is using, execute
:set enc?
My guess is that it will return "utf-8". One solution might be to save the file with Latin1 encoding instead. To do that, execute
:set fenc=latin1
:w
If instead you want to look for any non-ASCII characters in the file and change them where needed, search for characters in the range 0x80 to 0xff using
/[\x80-\xff]
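Outside Vim, a similar check can be sketched with grep in the C locale. The file name and the UTF-8 byte sequence below are assumptions for illustration:

```shell
# Write a two-line file where line 2 contains a UTF-8 'e with acute'
# (bytes 0xc3 0xa9); 'sample.tex' is a hypothetical name.
printf 'clean line\ncaf\xc3\xa9 line\n' > sample.tex

# In the C locale, grep matches raw bytes; [^ -~] is any byte outside
# printable ASCII (note this also flags tabs, if the file contains any).
LC_ALL=C grep -n '[^ -~]' sample.tex
```

The -n flag prints the line number of each match, which tells you where to look in Vim.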
To find out more about Vim's use of different encodings, see
:help enc
:help fenc
:help 45.3
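The same re-encoding can also be done outside Vim with iconv, assuming the file really is UTF-8 and every character in it has a Latin-1 equivalent. The file names and content below are illustrative:

```shell
# 'sample.tex' is a hypothetical UTF-8 file; \xc3\xa9 is UTF-8 for 'e acute',
# which is the single byte 0xe9 in Latin-1.
printf 'caf\xc3\xa9\n' > sample.tex
iconv -f UTF-8 -t ISO-8859-1 sample.tex > sample-latin1.tex

# Inspect the converted bytes: the two-byte UTF-8 sequence became one byte, e9.
od -An -tx1 sample-latin1.tex
```

If iconv reports a conversion error, the file contains a character that does not exist in Latin-1, which is worth knowing before forcing :set fenc=latin1 in Vim.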
Solution 3:
An alternative to using Vim is to run
tr -d '[a-zA-Z0-9!#@_?+ \t\n\\()"^~`%-]'\'{} < your_latex_file.tex | hexdump -c
This should show you which characters in the file fall outside the normal range of printable characters.
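A rough, more portable variant of the same idea can be sketched with a POSIX character class: delete every ordinary printable byte and see what is left. The file name and the non-breaking-space byte below are assumptions for illustration:

```shell
# 'sample.tex' is a hypothetical file containing a UTF-8 non-breaking space
# (bytes 0xc2 0xa0) hidden in otherwise plain text.
printf 'plain \xc2\xa0text\n' > sample.tex

# In the C locale, [:print:] covers exactly the printable ASCII bytes;
# deleting those plus newlines leaves only the suspicious bytes behind.
LC_ALL=C tr -d '[:print:]\n' < sample.tex | hexdump -C
```

An empty result means the file contains nothing outside printable ASCII; any bytes that remain are candidates for the character LaTeX is complaining about.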