String._extension._transform

trait _transform[A <: String]
Supertypes: class Object, trait Matchable, class Any

Def

@targetName("char_Stream")
def char_~: ~[Char]

Source of Chars

Returns String as a Source of [[Char]].

  "abcd".char_~.tp // Prints ~(a, b, c, d)

Source: _transform.scala
inline def indent(tag: A | String): A

Indents text with the tag.

Prefixes the first line with the tag and each following line with spaces equal in length to the tag.

```
   "abc
de xyz".indent("Idxs: ").tp

   // Output
   Idxs: abc
         de xyz
```

Source: _transform.scala
@targetName("line_Stream")
def line_~: ~[A]

Source of lines

Creates a [[Source]] of Strings representing the lines (delimited by '\n') of this text.

```
  "abc
def
xyz".line_~.tp // Prints ~(abc, def, xyz)
```

Source: _transform.scala
@targetName("nonEmpty_Opt")
def nonEmpty_?: Opt[A]
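
No description accompanies nonEmpty_? on this page; a minimal usage sketch, assuming it wraps the String in an Opt when the string is non-empty and yields an empty Opt otherwise (both the semantics and the printed form in the comments are assumptions):

```
  "abc".nonEmpty_?.tp // assumed to print Opt(abc)
  "".nonEmpty_?.tp    // assumed to print Opt()
```
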
@targetName("split_Stream")
def split_~(separator: Char): ~[A]
@targetName("split_Stream")
def split_~(separators: Array[Char]): ~[A]
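
The two split_~ overloads are likewise undocumented here; a hedged sketch of how they might be called, assuming they split this String on the given separator character(s) and return the pieces as a Source (the printed output is an assumption):

```
  "a,b,c".split_~(',').tp             // assumed to print ~(a, b, c)
  "a,b;c".split_~(Array(',', ';')).tp // assumed to print ~(a, b, c)
```
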
@targetName("toBoolean_Opt")
@targetName("toBoolean_Result")
def toBoolean_??: Result[Boolean]

Boolean result conversion

Converts String to Boolean result

 "true".toBoolean_?? tp

 "abc".toBoolean_?? tp

 // Output
 Result(true)
 Result(failure=For input string: "abc")
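
Only the Result variant is illustrated above; a minimal sketch of the Opt variant toBoolean_?, assuming it mirrors toBoolean_?? but yields an empty Opt instead of a failure (the printed form is an assumption):

```
  "true".toBoolean_?.tp // assumed to print Opt(true)
  "abc".toBoolean_?.tp  // assumed to print Opt()
```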
Source: _transform.scala
@targetName("toDouble_Opt")
@targetName("toDouble_Result")
def toDouble_??: Result[Double]

Double result conversion

Converts String to Double result

 "123.45".toDouble_?? tp

 "abc".toDouble_?? tp

 // Output
 Result(123.45)
 Result(failure=For input string: "abc")
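
As with toBoolean_?, a hedged sketch of the undocumented Opt variant toDouble_?, assuming a failed parse yields an empty Opt (output is assumed):

```
  "123.45".toDouble_?.tp // assumed to print Opt(123.45)
  "abc".toDouble_?.tp    // assumed to print Opt()
```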
Source: _transform.scala
@targetName("toInt_Opt")
def toInt_?: Opt[Int]
@targetName("toInt_Result")
def toInt_??: Result[Int]

Int result conversion

Converts String to Int result

 "123".toInt_?? tp

 "abc".toInt_?? tp

 // Output
 Result(123)
 Result(failure=For input string: "abc")
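
For the Opt variant toInt_?, a hedged sketch along the same lines (output is assumed):

```
  "123".toInt_?.tp // assumed to print Opt(123)
  "abc".toInt_?.tp // assumed to print Opt()
```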
Source: _transform.scala
@targetName("tokenized_Stream")
def tokenized_~(separators: ~[A]): ~[(A, <>, A)]

Source of tokens

Multi-token tokenization.

Returns, for each token, a Tuple including:

   - Separator preceding the token, empty for the first token
   - [[!.Range]] of the token in the text
   - String token

 ```
    val str: String = (1 <> 40).~.makeString()
    "Text to Tokenize:".tp.tp
    str.tp.tp
    ("Token", "Range", "String").tp
    str.replace("\n", "").tokenized_~(Stream("000","111","222","333","444")).tp

    // Output

    Text to Tokenize:

    12345678910111213141516171819202122232425262728293031323334353637383940

    (Token,Range,String)
    --- --------- -------------------
    ?   ?         ?
    --- --------- -------------------
        0 <>> 11  12345678910
    111 14 <>> 33 2131415161718192021
    222 36 <>> 55 3242526272829303132
    333 58 <>> 71 4353637383940
    --- --------- -------------------
 ```
Value Params
separators

a Source of text separators to consider

Source: _transform.scala
@targetName("toLong_Opt")
@targetName("toLong_Result")
def toLong_??: Result[Long]

Long result conversion

Converts String to Long result

  "123".toLong_?? tp

  "abc".toLong_?? tp

  // Output
  Result(123)
  Result(failure=For input string: "abc")
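
And for toLong_?, the same hedged pattern (output is assumed):

```
  "123".toLong_?.tp // assumed to print Opt(123)
  "abc".toLong_?.tp // assumed to print Opt()
```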
Source: _transform.scala