Since a recent release, `tokenize` fails on a file containing double quotes.
Example:

```julia
using Tokenize

# Create test file
open("test.txt", "w") do out
    print(out, "f(\"data\")")
end

# Display test file
open("test.txt") do io
    print(String(read(io)))
end

# Try to tokenize it
open("test.txt") do io
    for token in tokenize(io)
        println(token)
    end
end
```
Output:

```
f
(
ERROR: type IOStream has no field data
Stacktrace:
  [1] getproperty
    @ ./Base.jl:42 [inlined]
  [2] emit(l::Tokenize.Lexers.Lexer{IOStream, Tokenize.Tokens.Token}, kind::Tokenize.Tokens.Kind, err::Tokenize.Tokens.TokenError)
    @ Tokenize.Lexers ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:254
  [3] emit
    @ ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:252 [inlined]
  [4] lex_quote(l::Tokenize.Lexers.Lexer{IOStream, Tokenize.Tokens.Token}, doemit::Bool)
    @ Tokenize.Lexers ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:790
  [5] lex_quote(l::Tokenize.Lexers.Lexer{IOStream, Tokenize.Tokens.Token})
    @ Tokenize.Lexers ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:777
  [6] next_token(l::Tokenize.Lexers.Lexer{IOStream, Tokenize.Tokens.Token}, start::Bool)
    @ Tokenize.Lexers ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:372
  [7] next_token
    @ ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:315 [inlined]
  [8] iterate
    @ ~/.julia/packages/Tokenize/bZ0tu/src/lexer.jl:102 [inlined]
  [9] (::var"#5#6")(io::IOStream)
    @ Main ./REPL[38]:3
 [10] open(f::var"#5#6", args::String; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Base ./io.jl:330
 [11] open(f::Function, args::String)
    @ Base ./io.jl:328
 [12] top-level scope
    @ REPL[38]:1
```
Julia version:

```
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Core(TM) i7-1065G7 CPU @ 1.30GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, icelake-client)
```

Tokenize version: v0.5.24
The above example works correctly with Tokenize v0.5.21.
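Until this regression is fixed, one possible workaround (my assumption, since the error originates in the `IOStream` code path of `emit`) is to read the file into a `String` first and tokenize that:

```julia
using Tokenize

# Workaround sketch (assumption): tokenize a String instead of an IOStream.
# The failing `emit` accesses a `.data` field that `IOStream` lacks, so
# tokenizing via the String entry point may sidestep the broken code path.
src = read("test.txt", String)
for token in tokenize(src)
    println(token)
end
```

This only helps when the whole file fits in memory, but for source files that is usually not a constraint.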