
Problems with scaladoc discarding part of output

2 replies
Ken McDonald
Joined: 2011-02-13,
User offline. Last seen 42 years 45 weeks ago.
When processing the following comment:
/** Provides the ability to process matched substrings in different ways, depending on which pattern they matched.

 A `Tokenizer` is constructed with a number of patterns and corresponding functions of type (`MatchResult` => `T`). The Tokenizer is then applied to a string via the `tokenize` method. The final result is of type `Seq[T]`, and is obtained by finding, in the input string, sequences that match one of the provided patterns, and using the corresponding function to transform that part of the input.

 @param default This function is applied to sections of the input that are not matched by any of the supplied Matchers.

 @param alternatives A sequence of `Matcher -> (MatchResult => T)` tuples. The `Matcher` instances are combined into a single pattern using the `|` operator, but the association between each and the function given with it is maintained.

 Example:
 {{{
 val t = new Tokenizer(
   (mr: MatchResult) => "?",
   Seq(
     Lit("a") -> ((mr: MatchResult) => "1"),
     Lit("b") -> ((mr: MatchResult) => "2")
   )
 )
 assert(t.tokenize("fabaabbc").mkString === "?121122?")
 }}}
*/
Scaladoc (the one that accompanies Scala 2.9.1) discards everything from the "@param" tags, including the param tags themselves.
Anyone else seen this behavior? Workarounds?
Thanks,
Ken
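For readers without Ken's library at hand, the `Tokenizer` described in the comment can be sketched minimally as below. `Lit`, `Matcher`, and `MatchResult` belong to his library, so plain `Regex` and `Match` stand in for them here; the single-group-per-alternative trick and the `String => T` default are assumptions for the sketch, not his actual implementation:

```scala
import scala.util.matching.Regex
import scala.util.matching.Regex.Match

// Hypothetical sketch of the Tokenizer described in the post.
// Assumes the alternative patterns contain no capturing groups of
// their own, so group i+1 of the combined pattern identifies
// which alternative matched.
class Tokenizer[T](default: String => T,
                   alternatives: Seq[(Regex, Match => T)]) {

  // Combine the patterns with `|`, wrapping each in a group so we
  // can tell which alternative matched.
  private val combined =
    new Regex(alternatives.map(a => "(" + a._1.regex + ")").mkString("|"))

  def tokenize(input: String): Seq[T] = {
    val out = scala.collection.mutable.Buffer[T]()
    var pos = 0
    for (m <- combined.findAllMatchIn(input)) {
      // Unmatched text between matches goes through `default`.
      if (m.start > pos) out += default(input.substring(pos, m.start))
      // The first non-null group tells us which pattern matched.
      val i = (1 to m.groupCount).indexWhere(g => m.group(g) != null)
      out += alternatives(i)._2(m)
      pos = m.end
    }
    if (pos < input.length) out += default(input.substring(pos))
    out.toSeq
  }
}

object TokenizerDemo extends App {
  val t = new Tokenizer[String](
    _ => "?",
    Seq(
      "a".r -> ((m: Match) => "1"),
      "b".r -> ((m: Match) => "2")
    )
  )
  assert(t.tokenize("fabaabbc").mkString == "?121122?")
  println(t.tokenize("fabaabbc").mkString)
}
```

The demo reproduces the behaviour claimed in the comment's example: the unmatched `f` and `c` become `?`, each `a` becomes `1`, and each `b` becomes `2`.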
dcsobral
Joined: 2009-04-23,
User offline. Last seen 38 weeks 5 days ago.
Re: Problems with scaladoc discarding part of output

On Mon, Feb 6, 2012 at 00:46, Ken McDonald wrote:
> When processing the following comment:
>
> /**Provides the ability to process matched substrings in different ways,
> depending on which pattern they matched.
>
>  A `Tokenizer` is constructed with a number of patterns and corresponding
> functions of type (`MatchResult` => `T`). The
>  Tokenizer is then applied to a string via the `tokenize` method. The final
> result is of type `Seq[T]`, and is obtained
>  by finding in the input string, sequences that match one of the provided
> patterns, and using the corresponding function
>  to transform that part of the input.
>
>  @param default This function is applied to sections of the input that are
> not matched by any of the supplied Matchers.
>
>  @param alternatives A sequence of `Matcher -> (MatchResult => T)` tuples.
> The `Matcher` instances are combined
>  into a single pattern using the `|` operator, but the association between
> them and the function given with them
>  is maintained.
>
>  Example:{{{
>   val t = new Tokenizer(
>     (mr: MatchResult) => "?",
>     Seq(
>   Lit("a") -> ((mr: MatchResult) => "1"),
>   Lit("b") -> ((mr: MatchResult) => "2")
>  )
>   )
>   assert(t.tokenize("fabaabbc").mkString === "?121122?")
> }}}
> */
>
> Scaladoc (the one that accompanies Scala 2.9.1) discards everything from the
> "@param" tags, including the param tags themselves.
>
> Anyone else seen this behavior? Workarounds?

Strange. From my recent read of that code, I know that once it starts
processing tags, *everything* is assigned to the last tag. Still, the
tags should appear, and the example should be on the alternatives
parameter description.
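Given that behaviour — once Scaladoc sees a tag, all subsequent text is attached to the most recent tag — one workaround (an assumption based on dcsobral's description, not verified against 2.9.1) is to keep all free-form prose and the `{{{ }}}` example before the first `@param`, so nothing after the tags needs to survive:

```scala
/** Provides the ability to process matched substrings ...
 *
 *  Example:
 *  {{{
 *  val t = new Tokenizer(...)
 *  }}}
 *
 *  @param default ...
 *  @param alternatives ...
 */
```

With this layout the example is part of the main comment body rather than trailing text that gets folded into the last `@param` description.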

Ken McDonald
Joined: 2011-02-13,
User offline. Last seen 42 years 45 weeks ago.
Re: Problems with scaladoc discarding part of output
Just in case, I tried moving the @param block around -- it's definitely the @param block that's the problem; even removing the {{{ }}} example block doesn't change anything.
Even odder, in another class in the same library, scaladoc correctly processes an @param, but then silently swallows an @throws and an @note.

Ken

Copyright © 2012 École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland