check.rkt:
Added the actual check-match macro.
test.rkt:
Just a provide statement
check-test.rkt:
7 additional tests for check-match, and a macro to help create tests
check.scrbl:
Documentation and examples for check-match
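For reference, a couple of examples of the kind the new documentation
covers:

  (require rackunit)
  ;; Passes when the value matches the `match' pattern:
  (check-match (list 1 2 3) (list _ _ 3))
  ;; An optional third expression checks the pattern's bindings:
  (check-match (list 1 (box 2)) (list x (box y)) (= (+ x y) 3))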
It was pulling from `scheme/gui/base', instead. The one from `scheme/gui/base'
is now different and still pulls from `scheme/gui/base'.
This could break some programs that accidentally depended on `scheme/gui/base'
exports from `gui-dynamic-require', but it's more likely to fix problems.
The `scheme/base' module had become unreachable from the `mred' module.
While that normally would be a good thing, it led to troublesome
multiple instantiations of `scheme/base' that caused problems for
attaching further modules to the namespace.
inside the same collection so this file can (when other
things aren't too different) be used in a version of Racket
that doesn't generally have the tests
Track fixnum results in the same way as flonum results to enable
unboxing, if that turns out to be useful. The intent of the change,
though, is to support other types in the future, such as "extnums".
The output of `raco decompile' no longer includes `#%in', `#%flonum',
etc., annotations, which are mostly obvious and difficult to
keep in sync with the implementation. A local-binding name now
reflects a known type, however.
The change includes a bug repair for the bytecode compiler that
is independent of the generalization (i.e., the new test case
triggered the old problem using flonums).
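A hypothetical example of a loop that can benefit: the intermediate
flonum results could already stay unboxed, and fixnum results are now
tracked the same way.

  #lang racket/base
  (require racket/fixnum racket/flonum)
  (define (sum-squares n)
    (let loop ([i 0] [acc 0.0])
      (if (fx= i n)
          acc
          (loop (fx+ i 1)
                (fl+ acc (fl* (fx->fl i) (fx->fl i)))))))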
The `lazy-require' form expands to `define-runtime-module-path-index',
which doesn't work right at phase levels other than 0. Work around the
problem by generating a submodule to hold the
`define-runtime-module-path-index' form.
This repair fixes `raco exe' on certain uses of `match', which in turn
uses `lazy-require' at compile time.
Also, use `register-external-module' to generate appropriate
dependencies on lazily loaded modules.
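A minimal sketch of the triggering pattern: `lazy-require' used at a
phase level other than 0 (the module and binding names here are
arbitrary).

  #lang racket/base
  (require (for-syntax racket/base racket/lazy-require))
  (begin-for-syntax
    ;; expands to `define-runtime-module-path-index' at phase 1:
    (lazy-require [racket/list (last)]))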
The JIT was pessimistically using 64-bit jumps for long branches
or any jump between code that is allocated at different times.
Normally, though, code allocation stays within the same 32-bit
range of the heap, so stick to 32-bit jumps until forced by
allocation addresses to use 64-bit jump targets.
Commit ffe45ecce had introduced a regression with some
polymorphic functions imported between typed modules due to
miscommunicated variance information.
correct-xexpr?: inverted the logic and replaced the
continuation-passing style with simpler test-for-error logic. Also
corrected a typo in the attribute symbol checker that could otherwise
lead to a contract error (taking the cadr of a non-cadrable value).
I started with tabs that are not at the beginning of lines, and in
several places I did further cleanup.
If you're worried about knowing who wrote some code, for example, if you
get to this commit in "git blame", then note that you can use the "-w"
flag in many git commands to ignore whitespace. For example, to see
per-line authors, use "git blame -w <file>". Another example: to see
the (*much* smaller) non-whitespace changes in this (or any other)
commit, use "git log -p -w -1 <sha1>".
In `(if (pair? x) E1 E2)', convert `(car x)' in E1 to
`(unsafe-car x)', and similarly for `(cdr x)'. Also,
`(begin (car x) (cdr x))' converts to `(begin (car x)
(unsafe-cdr x))' since `(car x)' implies a `pair?' test
on `x'.
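For example, in a (hypothetical) function like

  (define (swap-pair x)
    (if (pair? x)
        (cons (cdr x) (car x))  ; both accessors become unsafe variants
        x))

both `(car x)' and `(cdr x)' convert to their unsafe forms, since they
appear under the `(pair? x)' test.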
typing them in, in the module language test suite
this speeds it up, going from 140 to 105 seconds on my (Mac)
machine. (DrDr was taking 240 or so seconds, though.)
Lift the restriction that an editor cannot change its revision number
during reading.
This restriction was enforced only for editors that have
non-string-snip% snips. The restriction was in place because the
implementation strategy was to chain through the snips in the editor
using (send snip next), and that isn't safe if the revision number
changes.
The lifting of the restriction is implemented by tracking the position
in the editor where the last snip ended and, if the revision number
changes, starting over trying to get a snip from that position. This
has the effect that, if the revision number never changes, the code
should behave the same as it did before (so hopefully any new
bugs I've introduced in this commit will only show up if the old
implementation would have raised an error)
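A rough sketch of the idea (not the actual framework code; text%
method names):

  ;; Fetch the next snip, restarting from the recorded position when
  ;; the editor's revision number has changed since the last fetch.
  (define (next-snip text snip pos-after-snip rev)
    (if (= rev (send text get-revision-number))
        (send snip next)
        (send text find-snip pos-after-snip 'after)))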
Also, exploit the lifting of this restriction in the colorer so it
doesn't have to restart the port for the coloring that happens along
with the parsing.
Shape information allows the linker to check the importing
module's compile-time expectation against the run-time
value of its imports. The JIT, in turn, can rely on that
checking to better inline structure-type predicates, etc.,
and to more directly call JIT-generated code across
module boundaries.
In addition to checking the "shape" of an import, the import's
JITted vs. non-JITted state must be consistent. To prevent shifts
in JIT state, the `eval-jit-enabled' parameter is now restricted
in its effect to top-level bindings.
so that it waits until online check syntax actually
finishes (otherwise, there actually is a leak;
the link is broken when the message comes back from the
other place)
This tracking allows the compiler to treat structure sub-type
declarations as generating constant results, and it also allows
the compiler to recognize an application of a constructor or
predicate as functional.
The JIT takes advantage of known-constant bindings to avoid the
check that a variable is still bound to a structure predicate,
selector, or mutator; that makes the code short enough to really
inline. The inlined version takes about half the time of the
indirect version.
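A hypothetical example of code that benefits: `posn?' and `posn-x'
below are known to be bound to a structure predicate and selector, so
the JIT can inline them.

  (struct posn (x y))
  (define (x-or-zero p)
    (if (posn? p) (posn-x p) 0))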
The compiler does not yet track bindings precisely enough to
recognize constants for sub-type declarations.
This fixes several issues:
- `Parameter` generates impersonator contracts correctly
- `Any` handling now copies immutable data when possible
- `Any` now recognizes more atomic base types
Merge to 5.3.1.
This reverts commit d39780a130.
Matthew says this test is really about TCP, so it should not be
changed, although perhaps we can use a more basic TCP test to check
whether this should be done.
Turn use of a finalized ffi callout into a reported error,
instead of a crash. Clarify the existence of the finalizer
in the docs. Fix error logging of the finalizer thread.
Merge to v5.3.1
Bytecode changes in two small ways to help the validator:
* a cross-module variable reference preserves the compiler's
annotation on whether the reference is constant, fixed, or other
* lifted procedures now appear in the module body just before the
definitions that use them, instead of at the beginning of the
module body
The new argument gets to chaperone/impersonate a guard at
the prompt, and it is applied when the continuation is applied ---
based on a wrapper on the prompt tag of the continuation (as opposed to
the prompt tag of the prompt).
The new argument gets to filter results that come from a
non-composable continuation that replaces one delimited
by a prompt using the chaperoned/impersonated prompt tag.
When the JIT guesses that an identifier is bound to a
structure predicate, getter, setter, etc., but that guess
turns out to be wrong, and the call is in a tail position,
then preserve tail-call behavior.
(Changes include some setup to inline structure constructors.)
The contract now has two major differences:
- It raises an error when it would have to wrap.
- It uses chaperones to delay errors as long as possible
In general, using `Any` as a type when exporting to untyped
code will now just work, unless the untyped code tries to
communicate values back to the typed side, in which case an
immediate error will be raised.
Much of the implementation comes from the membrane design
from [Strickland et al, OOPSLA 2012].
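A sketch of the intended behavior (module and binding names
hypothetical; modern annotation syntax):

  #lang typed/racket
  (define b : Any (box 0))
  (provide b)

  ;; In an untyped client module:
  ;; (unbox b)       ; reading the value out just works
  ;; (set-box! b 1)  ; sending a value back raises an immediate error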
raw-module-path inside of a phaseless-spec (see
the #%require docs for the description of these).
Also, Rackety
in conjunction with commit 9047427 (and an earlier
commit in those files/dirs), this commit:
closes PR 7815
closes PR 10455
closes PR 10788
(the way things currently stand, check syntax needs more information
from the fully expanded form, but at least now it has a better chance
to actually use that information, if it were there ...)
related to PR 7815
related to PR 10455
related to PR 10788
An attempt to detect a submodule could trigger the original module
name resolver when the would-be enclosing module would be handled
by the embedding-specific resolver. When a submodule is not found
but its would-be enclosing module is embedded, then assume that
the default resolver wouldn't find the submodule, either --- and
therefore avoid a potential "collection not found" error.
Also, add 'lsquo as allowed content.
Omitting the ` conversion in the first place was over-conservative.
There's a backward-compatibility issue with this addition (i.e., a
document might contain a backquote in a decoded context that is
meant to be rendered as a backquote), but the potential problems
seem minor.
The `slideshow/code-pict' library is the same as `slideshow/code', but
it works in non-GUI settings. Only the `slideshow/code' library connects
the code font size to `current-font-size', though.
The `code' macro, `define-code', etc., now support "code transformers",
which are syntax bindings that trigger otherwise-unescaped transformations
in the code to typeset (which can make the code easier to read and
friendlier to auto-indentation).
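For example, a minimal non-GUI use:

  #lang racket/base
  (require slideshow/code-pict)
  ;; `code' produces a pict without instantiating any GUI libraries:
  (code (+ 1 2))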
Other major changes:
- pg code now uses only binary format
- pg timestamptz now always UTC (tz = 0), added doc section
- added contracts to most pg "can't-convert" errors
Support for break clauses complicates expansion to `for/fold/derived';
a new `syntax/for-body' library provides a helper for macros that need
to split a `for'-style body into a prefix part and wrappable part.
Allows the use of `in-generator' to produce multiple values in a
position other than immediately within `for' (where the arity
can be inferred).
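For example, an `in-generator' sequence defined away from the `for'
clause now works when its arity is declared:

  (require racket/generator)
  (define two-valued
    (in-generator #:arity 2
      (yield 1 'a)
      (yield 2 'b)))
  (for ([(n s) two-valued])
    (printf "~s ~s\n" n s))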
Closes PR 11662
The new parameter (and supporting environment variables and
command-line flags) can redirect bytecode lookup to a tree other than
where a source file resides, so that sources and generated
compiled files can be kept separate. It also supports storing
bytecode files in a version-specific location (either with
the source or elsewhere).
The `make-log-receiver' function now includes a logger-name
filter. This filter is implemented at a low enough level that
it affects `log-level?' tests that check whether a log message
needs to be constructed at all.
The -W and -L flags and PLTSTDERR and PLTSYSLOG environment variables
support filters of the form "<level> <level>@<name> ...", where
<level>@<name> specializes filtering of events for a logger whose
name matches <name> to show <level> and higher.
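For example, PLTSTDERR="error debug@GC" shows `debug'-level events
from the logger named GC and `error'-level events from all others.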
The old `cast' didn't work right for a mismatch between
a pointer's GCableness and the source or target types, and
it didn't work right for a GCable pointer with a non-zero
offset. While those pitfalls were documented, the first
of them definitely has been a source of bugs in code that
I wrote.
Also added `cpointer-gcable?'
Add `file-position*', which can return #f instead of raising
an exception when a port's position is unknown. Change
`make-input-port' and `make-output-port' to accept more
kinds of values as the initial position.
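For example (a string port's position is known; for a port with no
known position, `file-position*' returns #f where `file-position'
would raise):

  (define p (open-input-string "hello"))
  (read-char p)
  (file-position* p)  ; => 1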
These changes make it possible to synchronize a port's
position with a `port-commit-peeked' action. It's ugly,
which I think reflects something broken about position
tracking in the port protocol (which seems difficult to fix
without breaking compatibility).
Providing a port instead of a reading or writing procedure
redirects the read/write to the specified port. This shortcut
is kind of a hack, but the run-time system can easily streamline
the redirection when it's exposed this way.
Using the new redirection feature reduces overhead in
`with-output-to-bytes' and `pretty-print'.
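A small sketch of the shortcut: supplying an existing port as the
reading (and peeking) procedure redirects to it.

  (define src (open-input-string "hello"))
  (define p (make-input-port 'redirected
                             src     ; read-in: redirect to `src'
                             src     ; peek: redirect to `src'
                             void))  ; close
  (read-line p)  ; => "hello"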
We can't disallow the creation of bad mutators without breaking
old code, but we can prevent the JIT from treating them like
good ones.
Closes PR 13062
Stream generic operations stopped working for lists
since the operations used only the generic dispatcher
instead of the real generic functions.
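The regressed case, roughly: lists are streams, so the generic
operations should apply to them directly.

  (require racket/stream)
  (stream-first '(1 2 3))  ; => 1
  (stream-ref '(1 2 3) 2)  ; => 3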
(Moral of this story: write more tests)