# tokenizer.py
                # XXX EMIT: push the parse error and the (now incorrect)
                # doctype token onto the queue, then return to the data state.
                self.stream.unget(data)
                self.tokenQueue.append({"type": "ParseError", "data":
                    _(u"Unexpected end of file in bogus doctype.")})
                self.tokenQueue.append(self.currentToken)
                self.state = self.states["data"]
            else:
                # Any other character is discarded; stay in the bogus
                # doctype state until ">" or EOF is seen.
                pass
            return True
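The branch above follows the tokenizer's general convention: each state method consumes input, appends tokens (or `ParseError` entries) to `self.tokenQueue`, optionally switches `self.state`, and returns `True` to keep the tokenizing loop running. A minimal sketch of that pattern follows; the class and method names here are illustrative stand-ins, not html5lib's actual API:

```python
class MiniTokenizer:
    """Toy state-machine tokenizer: each state handler consumes one
    character, queues zero or more tokens, and picks the next state."""

    EOF = None  # sentinel for end of input

    def __init__(self, text):
        self.chars = iter(text)
        self.tokenQueue = []
        self.buffer = []
        self.states = {"data": self.dataState}
        self.state = self.states["data"]

    def char(self):
        # Return the next character, or EOF when the input is exhausted.
        return next(self.chars, self.EOF)

    def dataState(self):
        data = self.char()
        if data == self.EOF:
            # Flush any buffered characters as a single token.
            if self.buffer:
                self.tokenQueue.append({"type": "Characters",
                                        "data": "".join(self.buffer)})
            return False  # stop the tokenizing loop
        self.buffer.append(data)
        return True  # keep tokenizing

    def run(self):
        # Pump the current state handler until one returns False.
        while self.state():
            pass
        return self.tokenQueue


tokens = MiniTokenizer("hi").run()
# tokens == [{"type": "Characters", "data": "hi"}]
```

The `states` dict plus `self.state` indirection mirrors how the real tokenizer switches states (`self.state = self.states["data"]`) without a long if/elif dispatch in the main loop.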