Mirror of https://salsa.debian.org/mdosch/feed-to-muc.git (synced 2024-11-22 22:18:39 +01:00)
Updated vendor packages.
This commit is contained in: parent 54c5bfce0b, commit 29d2cfa466
106 changed files with 24080 additions and 0 deletions
12 vendor/github.com/PuerkitoBio/goquery/LICENSE (generated, vendored, Normal file)

@@ -0,0 +1,12 @@
Copyright (c) 2012-2016, Martin Angers & Contributors
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

* Neither the name of the author nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
181 vendor/github.com/PuerkitoBio/goquery/README.md (generated, vendored, Normal file)

@@ -0,0 +1,181 @@
# goquery - a little like that j-thing, only in Go

[![build status](https://secure.travis-ci.org/PuerkitoBio/goquery.svg?branch=master)](http://travis-ci.org/PuerkitoBio/goquery) [![GoDoc](https://godoc.org/github.com/PuerkitoBio/goquery?status.png)](http://godoc.org/github.com/PuerkitoBio/goquery) [![Sourcegraph Badge](https://sourcegraph.com/github.com/PuerkitoBio/goquery/-/badge.svg)](https://sourcegraph.com/github.com/PuerkitoBio/goquery?badge)

goquery brings a syntax and a set of features similar to [jQuery][] to the [Go language][go]. It is based on Go's [net/html package][html] and the CSS Selector library [cascadia][]. Since the net/html parser returns nodes, and not a full-featured DOM tree, jQuery's stateful manipulation functions (like height(), css(), detach()) have been left off.

Also, because the net/html parser requires UTF-8 encoding, so does goquery: it is the caller's responsibility to ensure that the source document provides UTF-8 encoded HTML. See the [wiki][] for various options to do this.
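As a minimal sketch of one such option (an assumption, not the only approach the wiki describes), the body can be transcoded with `golang.org/x/net/html/charset` before it is handed to goquery; the URL below is a placeholder:

```Go
package main

import (
    "fmt"
    "log"
    "net/http"

    "github.com/PuerkitoBio/goquery"
    "golang.org/x/net/html/charset"
)

func main() {
    res, err := http.Get("http://example.com") // placeholder URL
    if err != nil {
        log.Fatal(err)
    }
    defer res.Body.Close()

    // charset.NewReader sniffs the document encoding (Content-Type header,
    // <meta> tag or BOM) and returns a reader that yields UTF-8.
    utf8Body, err := charset.NewReader(res.Body, res.Header.Get("Content-Type"))
    if err != nil {
        log.Fatal(err)
    }

    doc, err := goquery.NewDocumentFromReader(utf8Body)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(doc.Find("title").Text())
}
```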
Syntax-wise, it is as close as possible to jQuery, with the same function names when possible, and that warm and fuzzy chainable interface. jQuery being the ultra-popular library that it is, I felt that writing a similar HTML-manipulating library was better to follow its API than to start anew (in the same spirit as Go's `fmt` package), even though some of its methods are less than intuitive (looking at you, [index()][index]...).

## Table of Contents

* [Installation](#installation)
* [Changelog](#changelog)
* [API](#api)
* [Examples](#examples)
* [Related Projects](#related-projects)
* [Support](#support)
* [License](#license)

## Installation

Please note that because of the net/html dependency, goquery requires Go1.1+.

    $ go get github.com/PuerkitoBio/goquery

(optional) To run unit tests:

    $ cd $GOPATH/src/github.com/PuerkitoBio/goquery
    $ go test

(optional) To run benchmarks (warning: it runs for a few minutes):

    $ cd $GOPATH/src/github.com/PuerkitoBio/goquery
    $ go test -bench=".*"

## Changelog

**Note that goquery's API is now stable, and will not break.**

* **2018-11-15 (v1.5.0)** : Go module support (thanks @Zaba505).
* **2018-06-07 (v1.4.1)** : Add `NewDocumentFromReader` examples.
* **2018-03-24 (v1.4.0)** : Deprecate `NewDocument(url)` and `NewDocumentFromResponse(response)`.
* **2018-01-28 (v1.3.0)** : Add `ToEnd` constant to `Slice` until the end of the selection (thanks to @davidjwilkins for raising the issue).
* **2018-01-11 (v1.2.0)** : Add `AddBack*` and deprecate `AndSelf` (thanks to @davidjwilkins).
* **2017-02-12 (v1.1.0)** : Add `SetHtml` and `SetText` (thanks to @glebtv).
* **2016-12-29 (v1.0.2)** : Optimize allocations for `Selection.Text` (thanks to @radovskyb).
* **2016-08-28 (v1.0.1)** : Optimize performance for large documents.
* **2016-07-27 (v1.0.0)** : Tag version 1.0.0.
* **2016-06-15** : Invalid selector strings internally compile to a `Matcher` implementation that never matches any node (instead of a panic). So for example, `doc.Find("~")` returns an empty `*Selection` object.
* **2016-02-02** : Add `NodeName` utility function similar to the DOM's `nodeName` property. It returns the tag name of the first element in a selection, and other relevant values of non-element nodes (see godoc for details). Add `OuterHtml` utility function similar to the DOM's `outerHTML` property (named `OuterHtml` in small caps for consistency with the existing `Html` method on the `Selection`).
* **2015-04-20** : Add `AttrOr` helper method to return the attribute's value or a default value if absent. Thanks to [piotrkowalczuk][piotr].
* **2015-02-04** : Add more manipulation functions - Prepend* - thanks again to [Andrew Stone][thatguystone].
* **2014-11-28** : Add more manipulation functions - ReplaceWith*, Wrap* and Unwrap - thanks again to [Andrew Stone][thatguystone].
* **2014-11-07** : Add manipulation functions (thanks to [Andrew Stone][thatguystone]) and `*Matcher` functions, that receive compiled cascadia selectors instead of selector strings, thus avoiding potential panics thrown by goquery via `cascadia.MustCompile` calls. This results in better performance (selectors can be compiled once and reused) and more idiomatic error handling (you can handle cascadia's compilation errors, instead of recovering from panics, which had been bugging me for a long time). Note that the actual type expected is a `Matcher` interface, that `cascadia.Selector` implements. Other matcher implementations could be used.
* **2014-11-06** : Change import paths of net/html to golang.org/x/net/html (see https://groups.google.com/forum/#!topic/golang-nuts/eD8dh3T9yyA). Make sure to update your code to use the new import path too when you call goquery with `html.Node`s.
* **v0.3.2** : Add `NewDocumentFromReader()` (thanks jweir) which allows creating a goquery document from an io.Reader.
* **v0.3.1** : Add `NewDocumentFromResponse()` (thanks assassingj) which allows creating a goquery document from an http response.
* **v0.3.0** : Add `EachWithBreak()` which allows to break out of an `Each()` loop by returning false. This function was added instead of changing the existing `Each()` to avoid breaking compatibility.
* **v0.2.1** : Make go-getable, now that [go.net/html is Go1.0-compatible][gonet] (thanks to @matrixik for pointing this out).
* **v0.2.0** : Add support for negative indices in Slice(). **BREAKING CHANGE** `Document.Root` is removed, `Document` is now a `Selection` itself (a selection of one, the root element, just like `Document.Root` was before). Add jQuery's Closest() method.
* **v0.1.1** : Add benchmarks to use as baseline for refactorings, refactor Next...() and Prev...() methods to use the new html package's linked list features (Next/PrevSibling, FirstChild). Good performance boost (40+% in some cases).
* **v0.1.0** : Initial release.

## API

goquery exposes two structs, `Document` and `Selection`, and the `Matcher` interface. Unlike jQuery, which is loaded as part of a DOM document, and thus acts on its containing document, goquery doesn't know which HTML document to act upon. So it needs to be told, and that's what the `Document` type is for. It holds the root document node as the initial Selection value to manipulate.

jQuery often has many variants for the same function (no argument, a selector string argument, a jQuery object argument, a DOM element argument, ...). Instead of exposing the same features in goquery as a single method with variadic empty interface arguments, statically-typed signatures are used following this naming convention (illustrated by the sketch after this list):

* When the jQuery equivalent can be called with no argument, it has the same name as jQuery for the no argument signature (e.g.: `Prev()`), and the version with a selector string argument is called `XxxFiltered()` (e.g.: `PrevFiltered()`)
* When the jQuery equivalent **requires** one argument, the same name as jQuery is used for the selector string version (e.g.: `Is()`)
* The signatures accepting a jQuery object as argument are defined in goquery as `XxxSelection()` and take a `*Selection` object as argument (e.g.: `FilterSelection()`)
* The signatures accepting a DOM element as argument in jQuery are defined in goquery as `XxxNodes()` and take a variadic argument of type `*html.Node` (e.g.: `FilterNodes()`)
* The signatures accepting a function as argument in jQuery are defined in goquery as `XxxFunction()` and take a function as argument (e.g.: `FilterFunction()`)
* The goquery methods that can be called with a selector string have a corresponding version that take a `Matcher` interface and are defined as `XxxMatcher()` (e.g.: `IsMatcher()`)
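A minimal sketch of how these variants line up for one method family, using `Filter` (the markup and class names are illustrative placeholders):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, err := goquery.NewDocumentFromReader(strings.NewReader(
        `<ul><li class="a">one</li><li class="b">two</li></ul>`))
    if err != nil {
        panic(err)
    }
    items := doc.Find("li")

    _ = items.Filter(".a")                    // selector string version
    _ = items.FilterSelection(doc.Find(".a")) // *Selection version
    _ = items.FilterFunction(func(i int, s *goquery.Selection) bool {
        return s.HasClass("a") // function version
    })
    _ = items.FilterNodes(items.Get(0)) // variadic *html.Node version
    // There is also FilterMatcher(m Matcher) for pre-compiled cascadia selectors.

    fmt.Println(items.Filter(".a").Text()) // "one"
}
```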
Utility functions that are not in jQuery but are useful in Go are implemented as functions (that take a `*Selection` as parameter), to avoid a potential naming clash on the `*Selection`'s methods (reserved for jQuery-equivalent behaviour).

The complete [godoc reference documentation can be found here][doc].

Please note that Cascadia's selectors do not necessarily match all supported selectors of jQuery (Sizzle). See the [cascadia project][cascadia] for details. Invalid selector strings compile to a `Matcher` that fails to match any node. Behaviour of the various functions that take a selector string as argument follows from that fact, e.g. (where `~` is an invalid selector string; see the sketch after this list):

* `Find("~")` returns an empty selection because the selector string doesn't match anything.
* `Add("~")` returns a new selection that holds the same nodes as the original selection, because it didn't add any node (selector string didn't match anything).
* `ParentsFiltered("~")` returns an empty selection because the selector string doesn't match anything.
* `ParentsUntil("~")` returns all parents of the selection because the selector string didn't match any element to stop before the top element.
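A small sketch of that behaviour (the HTML snippet is a placeholder):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<div><p>a</p><p>b</p></div>`))
    ps := doc.Find("p")

    fmt.Println(doc.Find("~").Length())           // 0: invalid selector matches nothing
    fmt.Println(ps.Add("~").Length())             // 2: nothing was added
    fmt.Println(ps.ParentsFiltered("~").Length()) // 0
    fmt.Println(ps.ParentsUntil("~").Length())    // all parents (div, body, html)
}
```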
## Examples

See some tips and tricks in the [wiki][].

Adapted from example_test.go:

```Go
package main

import (
    "fmt"
    "log"
    "net/http"

    "github.com/PuerkitoBio/goquery"
)

func ExampleScrape() {
    // Request the HTML page.
    res, err := http.Get("http://metalsucks.net")
    if err != nil {
        log.Fatal(err)
    }
    defer res.Body.Close()
    if res.StatusCode != 200 {
        log.Fatalf("status code error: %d %s", res.StatusCode, res.Status)
    }

    // Load the HTML document
    doc, err := goquery.NewDocumentFromReader(res.Body)
    if err != nil {
        log.Fatal(err)
    }

    // Find the review items
    doc.Find(".sidebar-reviews article .content-block").Each(func(i int, s *goquery.Selection) {
        // For each item found, get the band and title
        band := s.Find("a").Text()
        title := s.Find("i").Text()
        fmt.Printf("Review %d: %s - %s\n", i, band, title)
    })
}

func main() {
    ExampleScrape()
}
```

## Related Projects

- [Goq][goq], an HTML deserialization and scraping library based on goquery and struct tags.
- [andybalholm/cascadia][cascadia], the CSS selector library used by goquery.
- [suntong/cascadia][cascadiacli], a command-line interface to the cascadia CSS selector library, useful to test selectors.
- [asciimoo/colly](https://github.com/asciimoo/colly), a lightning fast and elegant Scraping Framework
- [gnulnx/goperf](https://github.com/gnulnx/goperf), a website performance test tool that also fetches static assets.
- [MontFerret/ferret](https://github.com/MontFerret/ferret), declarative web scraping.
- [tacusci/berrycms](https://github.com/tacusci/berrycms), a modern simple to use CMS with easy to write plugins
- [Dataflow kit](https://github.com/slotix/dataflowkit), Web Scraping framework for Gophers.

## Support

There are a number of ways you can support the project:

* Use it, star it, build something with it, spread the word!
  - If you do build something open-source or otherwise publicly-visible, let me know so I can add it to the [Related Projects](#related-projects) section!
* Raise issues to improve the project (note: doc typos and clarifications are issues too!)
  - Please search existing issues before opening a new one - it may have already been addressed.
* Pull requests: please discuss new code in an issue first, unless the fix is really trivial.
  - Make sure new code is tested.
  - Be mindful of existing code - PRs that break existing code have a high probability of being declined, unless it fixes a serious issue.

If you desperately want to send money my way, I have a BuyMeACoffee.com page:

<a href="https://www.buymeacoffee.com/mna" target="_blank"><img src="https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png" alt="Buy Me A Coffee" style="height: 41px !important;width: 174px !important;box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;-webkit-box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;" ></a>

## License

The [BSD 3-Clause license][bsd], the same as the [Go language][golic]. Cascadia's license is [here][caslic].

[jquery]: http://jquery.com/
[go]: http://golang.org/
[cascadia]: https://github.com/andybalholm/cascadia
[cascadiacli]: https://github.com/suntong/cascadia
[bsd]: http://opensource.org/licenses/BSD-3-Clause
[golic]: http://golang.org/LICENSE
[caslic]: https://github.com/andybalholm/cascadia/blob/master/LICENSE
[doc]: http://godoc.org/github.com/PuerkitoBio/goquery
[index]: http://api.jquery.com/index/
[gonet]: https://github.com/golang/net/
[html]: http://godoc.org/golang.org/x/net/html
[wiki]: https://github.com/PuerkitoBio/goquery/wiki/Tips-and-tricks
[thatguystone]: https://github.com/thatguystone
[piotr]: https://github.com/piotrkowalczuk
[goq]: https://github.com/andrewstuart/goq
124 vendor/github.com/PuerkitoBio/goquery/array.go (generated, vendored, Normal file)

@@ -0,0 +1,124 @@
package goquery

import (
    "golang.org/x/net/html"
)

const (
    maxUint = ^uint(0)
    maxInt  = int(maxUint >> 1)

    // ToEnd is a special index value that can be used as end index in a call
    // to Slice so that all elements are selected until the end of the Selection.
    // It is equivalent to passing (*Selection).Length().
    ToEnd = maxInt
)

// First reduces the set of matched elements to the first in the set.
// It returns a new Selection object, and an empty Selection object if
// the selection is empty.
func (s *Selection) First() *Selection {
    return s.Eq(0)
}

// Last reduces the set of matched elements to the last in the set.
// It returns a new Selection object, and an empty Selection object if
// the selection is empty.
func (s *Selection) Last() *Selection {
    return s.Eq(-1)
}

// Eq reduces the set of matched elements to the one at the specified index.
// If a negative index is given, it counts backwards starting at the end of the
// set. It returns a new Selection object, and an empty Selection object if the
// index is invalid.
func (s *Selection) Eq(index int) *Selection {
    if index < 0 {
        index += len(s.Nodes)
    }

    if index >= len(s.Nodes) || index < 0 {
        return newEmptySelection(s.document)
    }

    return s.Slice(index, index+1)
}

// Slice reduces the set of matched elements to a subset specified by a range
// of indices. The start index is 0-based and indicates the index of the first
// element to select. The end index is 0-based and indicates the index at which
// the elements stop being selected (the end index is not selected).
//
// The indices may be negative, in which case they represent an offset from the
// end of the selection.
//
// The special value ToEnd may be specified as end index, in which case all elements
// until the end are selected. This works both for a positive and negative start
// index.
func (s *Selection) Slice(start, end int) *Selection {
    if start < 0 {
        start += len(s.Nodes)
    }
    if end == ToEnd {
        end = len(s.Nodes)
    } else if end < 0 {
        end += len(s.Nodes)
    }
    return pushStack(s, s.Nodes[start:end])
}

// Get retrieves the underlying node at the specified index.
// Get without parameter is not implemented, since the node array is available
// on the Selection object.
func (s *Selection) Get(index int) *html.Node {
    if index < 0 {
        index += len(s.Nodes) // Negative index gets from the end
    }
    return s.Nodes[index]
}

// Index returns the position of the first element within the Selection object
// relative to its sibling elements.
func (s *Selection) Index() int {
    if len(s.Nodes) > 0 {
        return newSingleSelection(s.Nodes[0], s.document).PrevAll().Length()
    }
    return -1
}

// IndexSelector returns the position of the first element within the
// Selection object relative to the elements matched by the selector, or -1 if
// not found.
func (s *Selection) IndexSelector(selector string) int {
    if len(s.Nodes) > 0 {
        sel := s.document.Find(selector)
        return indexInSlice(sel.Nodes, s.Nodes[0])
    }
    return -1
}

// IndexMatcher returns the position of the first element within the
// Selection object relative to the elements matched by the matcher, or -1 if
// not found.
func (s *Selection) IndexMatcher(m Matcher) int {
    if len(s.Nodes) > 0 {
        sel := s.document.FindMatcher(m)
        return indexInSlice(sel.Nodes, s.Nodes[0])
    }
    return -1
}

// IndexOfNode returns the position of the specified node within the Selection
// object, or -1 if not found.
func (s *Selection) IndexOfNode(node *html.Node) int {
    return indexInSlice(s.Nodes, node)
}

// IndexOfSelection returns the position of the first node in the specified
// Selection object within this Selection object, or -1 if not found.
func (s *Selection) IndexOfSelection(sel *Selection) int {
    if sel != nil && len(sel.Nodes) > 0 {
        return indexInSlice(s.Nodes, sel.Nodes[0])
    }
    return -1
}
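A quick usage sketch for the positional helpers above (markup and selectors are illustrative placeholders):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<ol><li>a</li><li>b</li><li>c</li><li>d</li></ol>`))
    items := doc.Find("li")

    fmt.Println(items.First().Text())                   // "a"
    fmt.Println(items.Eq(-1).Text())                    // "d": negative index counts from the end
    fmt.Println(items.Slice(1, goquery.ToEnd).Length()) // 3: everything from index 1 onward
    fmt.Println(items.Eq(2).Index())                    // 2: position among its sibling elements
}
```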
123 vendor/github.com/PuerkitoBio/goquery/doc.go (generated, vendored, Normal file)

@@ -0,0 +1,123 @@
// Copyright (c) 2012-2016, Martin Angers & Contributors
// All rights reserved.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation and/or
// other materials provided with the distribution.
// * Neither the name of the author nor the names of its contributors may be used to
// endorse or promote products derived from this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS
// OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
// AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR
// CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
// DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
// WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY
// WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

/*
Package goquery implements features similar to jQuery, including the chainable
syntax, to manipulate and query an HTML document.

It brings a syntax and a set of features similar to jQuery to the Go language.
It is based on Go's net/html package and the CSS Selector library cascadia.
Since the net/html parser returns nodes, and not a full-featured DOM
tree, jQuery's stateful manipulation functions (like height(), css(), detach())
have been left off.

Also, because the net/html parser requires UTF-8 encoding, so does goquery: it is
the caller's responsibility to ensure that the source document provides UTF-8 encoded HTML.
See the repository's wiki for various options on how to do this.

Syntax-wise, it is as close as possible to jQuery, with the same method names when
possible, and that warm and fuzzy chainable interface. jQuery being the
ultra-popular library that it is, writing a similar HTML-manipulating
library was better to follow its API than to start anew (in the same spirit as
Go's fmt package), even though some of its methods are less than intuitive (looking
at you, index()...).

It is hosted on GitHub, along with additional documentation in the README.md
file: https://github.com/puerkitobio/goquery

Please note that because of the net/html dependency, goquery requires Go1.1+.

The various methods are split into files based on the category of behavior.
The three dots (...) indicate that various "overloads" are available.

* array.go : array-like positional manipulation of the selection.
    - Eq()
    - First()
    - Get()
    - Index...()
    - Last()
    - Slice()

* expand.go : methods that expand or augment the selection's set.
    - Add...()
    - AndSelf()
    - Union(), which is an alias for AddSelection()

* filter.go : filtering methods, that reduce the selection's set.
    - End()
    - Filter...()
    - Has...()
    - Intersection(), which is an alias of FilterSelection()
    - Not...()

* iteration.go : methods to loop over the selection's nodes.
    - Each()
    - EachWithBreak()
    - Map()

* manipulation.go : methods for modifying the document
    - After...()
    - Append...()
    - Before...()
    - Clone()
    - Empty()
    - Prepend...()
    - Remove...()
    - ReplaceWith...()
    - Unwrap()
    - Wrap...()
    - WrapAll...()
    - WrapInner...()

* property.go : methods that inspect and get the node's properties values.
    - Attr*(), RemoveAttr(), SetAttr()
    - AddClass(), HasClass(), RemoveClass(), ToggleClass()
    - Html()
    - Length()
    - Size(), which is an alias for Length()
    - Text()

* query.go : methods that query, or reflect, a node's identity.
    - Contains()
    - Is...()

* traversal.go : methods to traverse the HTML document tree.
    - Children...()
    - Contents()
    - Find...()
    - Next...()
    - Parent[s]...()
    - Prev...()
    - Siblings...()

* type.go : definition of the types exposed by goquery.
    - Document
    - Selection
    - Matcher

* utilities.go : definition of helper functions (and not methods on a *Selection)
  that are not part of jQuery, but are useful to goquery.
    - NodeName
    - OuterHtml
*/
package goquery
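A minimal sketch of how these categories combine in one chain: traversal (Find), filtering (Filter), iteration (Each), properties (Attr, Text) and manipulation (AppendHtml); the markup is an illustrative placeholder:

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<div><a href="/a" class="ext">A</a><a href="/b">B</a></div>`))

    // traversal, filtering, iteration and properties in one chain
    doc.Find("a").Filter(".ext").Each(func(i int, s *goquery.Selection) {
        href, _ := s.Attr("href")
        fmt.Println(s.Text(), href) // "A /a"
    })

    // manipulation modifies the document in place
    doc.Find("div").AppendHtml(`<span>appended</span>`)
    fmt.Println(doc.Find("div span").Text()) // "appended"
}
```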
70 vendor/github.com/PuerkitoBio/goquery/expand.go (generated, vendored, Normal file)

@@ -0,0 +1,70 @@
package goquery

import "golang.org/x/net/html"

// Add adds the selector string's matching nodes to those in the current
// selection and returns a new Selection object.
// The selector string is run in the context of the document of the current
// Selection object.
func (s *Selection) Add(selector string) *Selection {
    return s.AddNodes(findWithMatcher([]*html.Node{s.document.rootNode}, compileMatcher(selector))...)
}

// AddMatcher adds the matcher's matching nodes to those in the current
// selection and returns a new Selection object.
// The matcher is run in the context of the document of the current
// Selection object.
func (s *Selection) AddMatcher(m Matcher) *Selection {
    return s.AddNodes(findWithMatcher([]*html.Node{s.document.rootNode}, m)...)
}

// AddSelection adds the specified Selection object's nodes to those in the
// current selection and returns a new Selection object.
func (s *Selection) AddSelection(sel *Selection) *Selection {
    if sel == nil {
        return s.AddNodes()
    }
    return s.AddNodes(sel.Nodes...)
}

// Union is an alias for AddSelection.
func (s *Selection) Union(sel *Selection) *Selection {
    return s.AddSelection(sel)
}

// AddNodes adds the specified nodes to those in the
// current selection and returns a new Selection object.
func (s *Selection) AddNodes(nodes ...*html.Node) *Selection {
    return pushStack(s, appendWithoutDuplicates(s.Nodes, nodes, nil))
}

// AndSelf adds the previous set of elements on the stack to the current set.
// It returns a new Selection object containing the current Selection combined
// with the previous one.
// Deprecated: This function has been deprecated and is now an alias for AddBack().
func (s *Selection) AndSelf() *Selection {
    return s.AddBack()
}

// AddBack adds the previous set of elements on the stack to the current set.
// It returns a new Selection object containing the current Selection combined
// with the previous one.
func (s *Selection) AddBack() *Selection {
    return s.AddSelection(s.prevSel)
}

// AddBackFiltered reduces the previous set of elements on the stack to those that
// match the selector string, and adds them to the current set.
// It returns a new Selection object containing the current Selection combined
// with the filtered previous one.
func (s *Selection) AddBackFiltered(selector string) *Selection {
    return s.AddSelection(s.prevSel.Filter(selector))
}

// AddBackMatcher reduces the previous set of elements on the stack to those that match
// the matcher, and adds them to the current set.
// It returns a new Selection object containing the current Selection combined
// with the filtered previous one.
func (s *Selection) AddBackMatcher(m Matcher) *Selection {
    return s.AddSelection(s.prevSel.FilterMatcher(m))
}
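A minimal sketch of how these expanding methods behave (markup and selectors are illustrative placeholders):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<div><h1>t</h1><p>a</p><p>b</p></div>`))

    // Add runs the selector against the whole document and merges the result.
    fmt.Println(doc.Find("p").Add("h1").Length()) // 3

    // AddBack merges the previous set on the stack (here: the <div>) back in.
    fmt.Println(doc.Find("div").Find("p").AddBack().Length()) // 3

    // Union is an alias for AddSelection.
    fmt.Println(doc.Find("h1").Union(doc.Find("p")).Length()) // 3
}
```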
163 vendor/github.com/PuerkitoBio/goquery/filter.go (generated, vendored, Normal file)

@@ -0,0 +1,163 @@
package goquery

import "golang.org/x/net/html"

// Filter reduces the set of matched elements to those that match the selector string.
// It returns a new Selection object for this subset of matching elements.
func (s *Selection) Filter(selector string) *Selection {
    return s.FilterMatcher(compileMatcher(selector))
}

// FilterMatcher reduces the set of matched elements to those that match
// the given matcher. It returns a new Selection object for this subset
// of matching elements.
func (s *Selection) FilterMatcher(m Matcher) *Selection {
    return pushStack(s, winnow(s, m, true))
}

// Not removes elements from the Selection that match the selector string.
// It returns a new Selection object with the matching elements removed.
func (s *Selection) Not(selector string) *Selection {
    return s.NotMatcher(compileMatcher(selector))
}

// NotMatcher removes elements from the Selection that match the given matcher.
// It returns a new Selection object with the matching elements removed.
func (s *Selection) NotMatcher(m Matcher) *Selection {
    return pushStack(s, winnow(s, m, false))
}

// FilterFunction reduces the set of matched elements to those that pass the function's test.
// It returns a new Selection object for this subset of elements.
func (s *Selection) FilterFunction(f func(int, *Selection) bool) *Selection {
    return pushStack(s, winnowFunction(s, f, true))
}

// NotFunction removes elements from the Selection that pass the function's test.
// It returns a new Selection object with the matching elements removed.
func (s *Selection) NotFunction(f func(int, *Selection) bool) *Selection {
    return pushStack(s, winnowFunction(s, f, false))
}

// FilterNodes reduces the set of matched elements to those that match the specified nodes.
// It returns a new Selection object for this subset of elements.
func (s *Selection) FilterNodes(nodes ...*html.Node) *Selection {
    return pushStack(s, winnowNodes(s, nodes, true))
}

// NotNodes removes elements from the Selection that match the specified nodes.
// It returns a new Selection object with the matching elements removed.
func (s *Selection) NotNodes(nodes ...*html.Node) *Selection {
    return pushStack(s, winnowNodes(s, nodes, false))
}

// FilterSelection reduces the set of matched elements to those that match a
// node in the specified Selection object.
// It returns a new Selection object for this subset of elements.
func (s *Selection) FilterSelection(sel *Selection) *Selection {
    if sel == nil {
        return pushStack(s, winnowNodes(s, nil, true))
    }
    return pushStack(s, winnowNodes(s, sel.Nodes, true))
}

// NotSelection removes elements from the Selection that match a node in the specified
// Selection object. It returns a new Selection object with the matching elements removed.
func (s *Selection) NotSelection(sel *Selection) *Selection {
    if sel == nil {
        return pushStack(s, winnowNodes(s, nil, false))
    }
    return pushStack(s, winnowNodes(s, sel.Nodes, false))
}

// Intersection is an alias for FilterSelection.
func (s *Selection) Intersection(sel *Selection) *Selection {
    return s.FilterSelection(sel)
}

// Has reduces the set of matched elements to those that have a descendant
// that matches the selector.
// It returns a new Selection object with the matching elements.
func (s *Selection) Has(selector string) *Selection {
    return s.HasSelection(s.document.Find(selector))
}

// HasMatcher reduces the set of matched elements to those that have a descendant
// that matches the matcher.
// It returns a new Selection object with the matching elements.
func (s *Selection) HasMatcher(m Matcher) *Selection {
    return s.HasSelection(s.document.FindMatcher(m))
}

// HasNodes reduces the set of matched elements to those that have a
// descendant that matches one of the nodes.
// It returns a new Selection object with the matching elements.
func (s *Selection) HasNodes(nodes ...*html.Node) *Selection {
    return s.FilterFunction(func(_ int, sel *Selection) bool {
        // Add all nodes that contain one of the specified nodes
        for _, n := range nodes {
            if sel.Contains(n) {
                return true
            }
        }
        return false
    })
}

// HasSelection reduces the set of matched elements to those that have a
// descendant that matches one of the nodes of the specified Selection object.
// It returns a new Selection object with the matching elements.
func (s *Selection) HasSelection(sel *Selection) *Selection {
    if sel == nil {
        return s.HasNodes()
    }
    return s.HasNodes(sel.Nodes...)
}

// End ends the most recent filtering operation in the current chain and
// returns the set of matched elements to its previous state.
func (s *Selection) End() *Selection {
    if s.prevSel != nil {
        return s.prevSel
    }
    return newEmptySelection(s.document)
}

// Filter based on the matcher, and the indicator to keep (Filter) or
// to get rid of (Not) the matching elements.
func winnow(sel *Selection, m Matcher, keep bool) []*html.Node {
    // Optimize if keep is requested
    if keep {
        return m.Filter(sel.Nodes)
    }
    // Use grep
    return grep(sel, func(i int, s *Selection) bool {
        return !m.Match(s.Get(0))
    })
}

// Filter based on an array of nodes, and the indicator to keep (Filter) or
// to get rid of (Not) the matching elements.
func winnowNodes(sel *Selection, nodes []*html.Node, keep bool) []*html.Node {
    if len(nodes)+len(sel.Nodes) < minNodesForSet {
        return grep(sel, func(i int, s *Selection) bool {
            return isInSlice(nodes, s.Get(0)) == keep
        })
    }

    set := make(map[*html.Node]bool)
    for _, n := range nodes {
        set[n] = true
    }
    return grep(sel, func(i int, s *Selection) bool {
        return set[s.Get(0)] == keep
    })
}

// Filter based on a function test, and the indicator to keep (Filter) or
// to get rid of (Not) the matching elements.
func winnowFunction(sel *Selection, f func(int, *Selection) bool, keep bool) []*html.Node {
    return grep(sel, func(i int, s *Selection) bool {
        return f(i, s) == keep
    })
}
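A brief sketch of the filtering methods above, including End() popping a chain back to its previous set (markup and class names are illustrative placeholders):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<ul><li class="active"><a>x</a></li><li>y</li><li>z</li></ul>`))
    items := doc.Find("li")

    fmt.Println(items.Filter(".active").Length()) // 1
    fmt.Println(items.Not(".active").Length())    // 2
    fmt.Println(items.Has("a").Length())          // 1: only the <li> with a descendant <a>

    // End undoes the most recent filtering step and returns the previous set.
    fmt.Println(items.Filter(".active").End().Length()) // 3
}
```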
6 vendor/github.com/PuerkitoBio/goquery/go.mod (generated, vendored, Normal file)

@@ -0,0 +1,6 @@
module github.com/PuerkitoBio/goquery

require (
    github.com/andybalholm/cascadia v1.0.0
    golang.org/x/net v0.0.0-20181114220301-adae6a3d119a
)
5 vendor/github.com/PuerkitoBio/goquery/go.sum (generated, vendored, Normal file)

@@ -0,0 +1,5 @@
github.com/andybalholm/cascadia v1.0.0 h1:hOCXnnZ5A+3eVDX8pvgl4kofXv2ELss0bKcqRySc45o=
github.com/andybalholm/cascadia v1.0.0/go.mod h1:GsXiBklL0woXo1j/WYWtSYYC4ouU9PqHO0sqidkEA4Y=
golang.org/x/net v0.0.0-20180218175443-cbe0f9307d01/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20181114220301-adae6a3d119a h1:gOpx8G595UYyvj8UK4+OFyY4rx037g3fmfhe5SasG3U=
golang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
39 vendor/github.com/PuerkitoBio/goquery/iteration.go (generated, vendored, Normal file)

@@ -0,0 +1,39 @@
package goquery

// Each iterates over a Selection object, executing a function for each
// matched element. It returns the current Selection object. The function
// f is called for each element in the selection with the index of the
// element in that selection starting at 0, and a *Selection that contains
// only that element.
func (s *Selection) Each(f func(int, *Selection)) *Selection {
    for i, n := range s.Nodes {
        f(i, newSingleSelection(n, s.document))
    }
    return s
}

// EachWithBreak iterates over a Selection object, executing a function for each
// matched element. It is identical to Each except that it is possible to break
// out of the loop by returning false in the callback function. It returns the
// current Selection object.
func (s *Selection) EachWithBreak(f func(int, *Selection) bool) *Selection {
    for i, n := range s.Nodes {
        if !f(i, newSingleSelection(n, s.document)) {
            return s
        }
    }
    return s
}

// Map passes each element in the current matched set through a function,
// producing a slice of string holding the returned values. The function
// f is called for each element in the selection with the index of the
// element in that selection starting at 0, and a *Selection that contains
// only that element.
func (s *Selection) Map(f func(int, *Selection) string) (result []string) {
    for i, n := range s.Nodes {
        result = append(result, f(i, newSingleSelection(n, s.document)))
    }

    return result
}
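A small sketch of the three iteration helpers above (markup is an illustrative placeholder):

```Go
package main

import (
    "fmt"
    "strings"

    "github.com/PuerkitoBio/goquery"
)

func main() {
    doc, _ := goquery.NewDocumentFromReader(strings.NewReader(
        `<p><a href="/a">a</a><a href="/b">b</a><a href="/c">c</a></p>`))
    links := doc.Find("a")

    // Map collects one string per element.
    hrefs := links.Map(func(i int, s *goquery.Selection) string {
        href, _ := s.Attr("href")
        return href
    })
    fmt.Println(hrefs) // [/a /b /c]

    // EachWithBreak stops as soon as the callback returns false.
    links.EachWithBreak(func(i int, s *goquery.Selection) bool {
        fmt.Println(s.Text())
        return i < 1 // prints "a" and "b", then stops
    })
}
```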
574
vendor/github.com/PuerkitoBio/goquery/manipulation.go
generated
vendored
Normal file
574
vendor/github.com/PuerkitoBio/goquery/manipulation.go
generated
vendored
Normal file
|
@ -0,0 +1,574 @@
|
||||||
|
package goquery
|
||||||
|
|
||||||
|
import (
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
)
|
||||||
|
|
||||||
|
// After applies the selector from the root document and inserts the matched elements
|
||||||
|
// after the elements in the set of matched elements.
|
||||||
|
//
|
||||||
|
// If one of the matched elements in the selection is not currently in the
|
||||||
|
// document, it's impossible to insert nodes after it, so it will be ignored.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) After(selector string) *Selection {
|
||||||
|
return s.AfterMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// AfterMatcher applies the matcher from the root document and inserts the matched elements
|
||||||
|
// after the elements in the set of matched elements.
|
||||||
|
//
|
||||||
|
// If one of the matched elements in the selection is not currently in the
|
||||||
|
// document, it's impossible to insert nodes after it, so it will be ignored.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AfterMatcher(m Matcher) *Selection {
|
||||||
|
return s.AfterNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AfterSelection inserts the elements in the selection after each element in the set of matched
|
||||||
|
// elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AfterSelection(sel *Selection) *Selection {
|
||||||
|
return s.AfterNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AfterHtml parses the html and inserts it after the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AfterHtml(html string) *Selection {
|
||||||
|
return s.AfterNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AfterNodes inserts the nodes after each element in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AfterNodes(ns ...*html.Node) *Selection {
|
||||||
|
return s.manipulateNodes(ns, true, func(sn *html.Node, n *html.Node) {
|
||||||
|
if sn.Parent != nil {
|
||||||
|
sn.Parent.InsertBefore(n, sn.NextSibling)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Append appends the elements specified by the selector to the end of each element
|
||||||
|
// in the set of matched elements, following those rules:
|
||||||
|
//
|
||||||
|
// 1) The selector is applied to the root document.
|
||||||
|
//
|
||||||
|
// 2) Elements that are part of the document will be moved to the new location.
|
||||||
|
//
|
||||||
|
// 3) If there are multiple locations to append to, cloned nodes will be
|
||||||
|
// appended to all target locations except the last one, which will be moved
|
||||||
|
// as noted in (2).
|
||||||
|
func (s *Selection) Append(selector string) *Selection {
|
||||||
|
return s.AppendMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// AppendMatcher appends the elements specified by the matcher to the end of each element
|
||||||
|
// in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AppendMatcher(m Matcher) *Selection {
|
||||||
|
return s.AppendNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AppendSelection appends the elements in the selection to the end of each element
|
||||||
|
// in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AppendSelection(sel *Selection) *Selection {
|
||||||
|
return s.AppendNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AppendHtml parses the html and appends it to the set of matched elements.
|
||||||
|
func (s *Selection) AppendHtml(html string) *Selection {
|
||||||
|
return s.AppendNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// AppendNodes appends the specified nodes to each node in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) AppendNodes(ns ...*html.Node) *Selection {
|
||||||
|
return s.manipulateNodes(ns, false, func(sn *html.Node, n *html.Node) {
|
||||||
|
sn.AppendChild(n)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Before inserts the matched elements before each element in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) Before(selector string) *Selection {
|
||||||
|
return s.BeforeMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// BeforeMatcher inserts the matched elements before each element in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) BeforeMatcher(m Matcher) *Selection {
|
||||||
|
return s.BeforeNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// BeforeSelection inserts the elements in the selection before each element in the set of matched
|
||||||
|
// elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) BeforeSelection(sel *Selection) *Selection {
|
||||||
|
return s.BeforeNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// BeforeHtml parses the html and inserts it before the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) BeforeHtml(html string) *Selection {
|
||||||
|
return s.BeforeNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// BeforeNodes inserts the nodes before each element in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) BeforeNodes(ns ...*html.Node) *Selection {
|
||||||
|
return s.manipulateNodes(ns, false, func(sn *html.Node, n *html.Node) {
|
||||||
|
if sn.Parent != nil {
|
||||||
|
sn.Parent.InsertBefore(n, sn)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Clone creates a deep copy of the set of matched nodes. The new nodes will not be
|
||||||
|
// attached to the document.
|
||||||
|
func (s *Selection) Clone() *Selection {
|
||||||
|
ns := newEmptySelection(s.document)
|
||||||
|
ns.Nodes = cloneNodes(s.Nodes)
|
||||||
|
return ns
|
||||||
|
}
|
||||||
|
|
||||||
|
// Empty removes all children nodes from the set of matched elements.
|
||||||
|
// It returns the children nodes in a new Selection.
|
||||||
|
func (s *Selection) Empty() *Selection {
|
||||||
|
var nodes []*html.Node
|
||||||
|
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
for c := n.FirstChild; c != nil; c = n.FirstChild {
|
||||||
|
n.RemoveChild(c)
|
||||||
|
nodes = append(nodes, c)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return pushStack(s, nodes)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Prepend prepends the elements specified by the selector to each element in
|
||||||
|
// the set of matched elements, following the same rules as Append.
|
||||||
|
func (s *Selection) Prepend(selector string) *Selection {
|
||||||
|
return s.PrependMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrependMatcher prepends the elements specified by the matcher to each
|
||||||
|
// element in the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) PrependMatcher(m Matcher) *Selection {
|
||||||
|
return s.PrependNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrependSelection prepends the elements in the selection to each element in
|
||||||
|
// the set of matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) PrependSelection(sel *Selection) *Selection {
|
||||||
|
return s.PrependNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrependHtml parses the html and prepends it to the set of matched elements.
|
||||||
|
func (s *Selection) PrependHtml(html string) *Selection {
|
||||||
|
return s.PrependNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrependNodes prepends the specified nodes to each node in the set of
|
||||||
|
// matched elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) PrependNodes(ns ...*html.Node) *Selection {
|
||||||
|
return s.manipulateNodes(ns, true, func(sn *html.Node, n *html.Node) {
|
||||||
|
// sn.FirstChild may be nil, in which case this functions like
|
||||||
|
// sn.AppendChild()
|
||||||
|
sn.InsertBefore(n, sn.FirstChild)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Remove removes the set of matched elements from the document.
|
||||||
|
// It returns the same selection, now consisting of nodes not in the document.
|
||||||
|
func (s *Selection) Remove() *Selection {
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
if n.Parent != nil {
|
||||||
|
n.Parent.RemoveChild(n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// RemoveFiltered removes the set of matched elements by selector.
|
||||||
|
// It returns the Selection of removed nodes.
|
||||||
|
func (s *Selection) RemoveFiltered(selector string) *Selection {
|
||||||
|
return s.RemoveMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// RemoveMatcher removes the set of matched elements.
|
||||||
|
// It returns the Selection of removed nodes.
|
||||||
|
func (s *Selection) RemoveMatcher(m Matcher) *Selection {
|
||||||
|
return s.FilterMatcher(m).Remove()
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReplaceWith replaces each element in the set of matched elements with the
|
||||||
|
// nodes matched by the given selector.
|
||||||
|
// It returns the removed elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) ReplaceWith(selector string) *Selection {
|
||||||
|
return s.ReplaceWithMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReplaceWithMatcher replaces each element in the set of matched elements with
|
||||||
|
// the nodes matched by the given Matcher.
|
||||||
|
// It returns the removed elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) ReplaceWithMatcher(m Matcher) *Selection {
|
||||||
|
return s.ReplaceWithNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReplaceWithSelection replaces each element in the set of matched elements with
|
||||||
|
// the nodes from the given Selection.
|
||||||
|
// It returns the removed elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) ReplaceWithSelection(sel *Selection) *Selection {
|
||||||
|
return s.ReplaceWithNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReplaceWithHtml replaces each element in the set of matched elements with
|
||||||
|
// the parsed HTML.
|
||||||
|
// It returns the removed elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) ReplaceWithHtml(html string) *Selection {
|
||||||
|
return s.ReplaceWithNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReplaceWithNodes replaces each element in the set of matched elements with
|
||||||
|
// the given nodes.
|
||||||
|
// It returns the removed elements.
|
||||||
|
//
|
||||||
|
// This follows the same rules as Selection.Append.
|
||||||
|
func (s *Selection) ReplaceWithNodes(ns ...*html.Node) *Selection {
|
||||||
|
s.AfterNodes(ns...)
|
||||||
|
return s.Remove()
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetHtml sets the html content of each element in the selection to
|
||||||
|
// specified html string.
|
||||||
|
func (s *Selection) SetHtml(html string) *Selection {
|
||||||
|
return setHtmlNodes(s, parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetText sets the content of each element in the selection to specified content.
|
||||||
|
// The provided text string is escaped.
|
||||||
|
func (s *Selection) SetText(text string) *Selection {
|
||||||
|
return s.SetHtml(html.EscapeString(text))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Unwrap removes the parents of the set of matched elements, leaving the matched
|
||||||
|
// elements (and their siblings, if any) in their place.
|
||||||
|
// It returns the original selection.
|
||||||
|
func (s *Selection) Unwrap() *Selection {
|
||||||
|
s.Parent().Each(func(i int, ss *Selection) {
|
||||||
|
// For some reason, jquery allows unwrap to remove the <head> element, so
|
||||||
|
// allowing it here too. Same for <html>. Why it allows those elements to
|
||||||
|
// be unwrapped while not allowing body is a mystery to me.
|
||||||
|
if ss.Nodes[0].Data != "body" {
|
||||||
|
ss.ReplaceWithSelection(ss.Contents())
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// Wrap wraps each element in the set of matched elements inside the first
|
||||||
|
// element matched by the given selector. The matched child is cloned before
|
||||||
|
// being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) Wrap(selector string) *Selection {
|
||||||
|
return s.WrapMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapMatcher wraps each element in the set of matched elements inside the
|
||||||
|
// first element matched by the given matcher. The matched child is cloned
|
||||||
|
// before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapMatcher(m Matcher) *Selection {
|
||||||
|
return s.wrapNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapSelection wraps each element in the set of matched elements inside the
|
||||||
|
// first element in the given Selection. The element is cloned before being
|
||||||
|
// inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapSelection(sel *Selection) *Selection {
|
||||||
|
return s.wrapNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapHtml wraps each element in the set of matched elements inside the inner-
|
||||||
|
// most child of the given HTML.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapHtml(html string) *Selection {
|
||||||
|
return s.wrapNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapNode wraps each element in the set of matched elements inside the inner-
|
||||||
|
// most child of the given node. The given node is copied before being inserted
|
||||||
|
// into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapNode(n *html.Node) *Selection {
|
||||||
|
return s.wrapNodes(n)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *Selection) wrapNodes(ns ...*html.Node) *Selection {
|
||||||
|
s.Each(func(i int, ss *Selection) {
|
||||||
|
ss.wrapAllNodes(ns...)
|
||||||
|
})
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapAll wraps a single HTML structure, matched by the given selector, around
|
||||||
|
// all elements in the set of matched elements. The matched child is cloned
|
||||||
|
// before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapAll(selector string) *Selection {
|
||||||
|
return s.WrapAllMatcher(compileMatcher(selector))
|
||||||
|
}
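
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). Unlike Wrap, which clones one wrapper per matched element,
// WrapAll inserts a single clone and moves every matched element inside it:
//
//	doc.Find("li.selected").WrapAll("ul.collected") // one shared <ul> wrapper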
|
||||||
|
|
||||||
|
// WrapAllMatcher wraps a single HTML structure, matched by the given Matcher,
|
||||||
|
// around all elements in the set of matched elements. The matched child is
|
||||||
|
// cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapAllMatcher(m Matcher) *Selection {
|
||||||
|
return s.wrapAllNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapAllSelection wraps a single HTML structure, the first node of the given
|
||||||
|
// Selection, around all elements in the set of matched elements. The matched
|
||||||
|
// child is cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapAllSelection(sel *Selection) *Selection {
|
||||||
|
return s.wrapAllNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapAllHtml wraps the given HTML structure around all elements in the set of
|
||||||
|
// matched elements. The matched child is cloned before being inserted into the
|
||||||
|
// document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapAllHtml(html string) *Selection {
|
||||||
|
return s.wrapAllNodes(parseHtml(html)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *Selection) wrapAllNodes(ns ...*html.Node) *Selection {
|
||||||
|
if len(ns) > 0 {
|
||||||
|
return s.WrapAllNode(ns[0])
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapAllNode wraps the given node around the first element in the Selection,
|
||||||
|
// making all other nodes in the Selection children of the given node. The node
|
||||||
|
// is cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapAllNode(n *html.Node) *Selection {
|
||||||
|
if s.Size() == 0 {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
wrap := cloneNode(n)
|
||||||
|
|
||||||
|
first := s.Nodes[0]
|
||||||
|
if first.Parent != nil {
|
||||||
|
first.Parent.InsertBefore(wrap, first)
|
||||||
|
first.Parent.RemoveChild(first)
|
||||||
|
}
|
||||||
|
|
||||||
|
for c := getFirstChildEl(wrap); c != nil; c = getFirstChildEl(wrap) {
|
||||||
|
wrap = c
|
||||||
|
}
|
||||||
|
|
||||||
|
newSingleSelection(wrap, s.document).AppendSelection(s)
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapInner wraps an HTML structure, matched by the given selector, around
// the content of each element in the set of matched elements. The matched
// child is cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapInner(selector string) *Selection {
|
||||||
|
return s.WrapInnerMatcher(compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapInnerMatcher wraps an HTML structure, matched by the given matcher,
// around the content of each element in the set of matched elements. The
// matched child is cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapInnerMatcher(m Matcher) *Selection {
|
||||||
|
return s.wrapInnerNodes(m.MatchAll(s.document.rootNode)...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapInnerSelection wraps the first node of the given Selection around the
// content of each element in the set of matched elements. The node is cloned
// before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapInnerSelection(sel *Selection) *Selection {
|
||||||
|
return s.wrapInnerNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapInnerHtml wraps the HTML structure parsed from the given string around
// the content of each element in the set of matched elements. The first node
// of the parsed HTML is cloned before being inserted into the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapInnerHtml(html string) *Selection {
|
||||||
|
return s.wrapInnerNodes(parseHtml(html)...)
|
||||||
|
}
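
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). WrapInnerHtml leaves the matched elements in place and wraps
// only their contents:
//
//	doc.Find("blockquote").WrapInnerHtml("<div class=\"inner\"></div>")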
|
||||||
|
|
||||||
|
// WrapInnerNode wraps the given node around the content of each element in
// the set of matched elements. The node is cloned before being inserted into
// the document.
|
||||||
|
//
|
||||||
|
// It returns the original set of elements.
|
||||||
|
func (s *Selection) WrapInnerNode(n *html.Node) *Selection {
|
||||||
|
return s.wrapInnerNodes(n)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *Selection) wrapInnerNodes(ns ...*html.Node) *Selection {
|
||||||
|
if len(ns) == 0 {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
s.Each(func(i int, s *Selection) {
|
||||||
|
contents := s.Contents()
|
||||||
|
|
||||||
|
if contents.Size() > 0 {
|
||||||
|
contents.wrapAllNodes(ns...)
|
||||||
|
} else {
|
||||||
|
s.AppendNodes(cloneNode(ns[0]))
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseHtml(h string) []*html.Node {
|
||||||
|
// Errors are only returned when the io.Reader returns any error besides
|
||||||
|
// EOF, but strings.Reader never will
|
||||||
|
nodes, err := html.ParseFragment(strings.NewReader(h), &html.Node{Type: html.ElementNode})
|
||||||
|
if err != nil {
|
||||||
|
panic("goquery: failed to parse HTML: " + err.Error())
|
||||||
|
}
|
||||||
|
return nodes
|
||||||
|
}
|
||||||
|
|
||||||
|
func setHtmlNodes(s *Selection, ns ...*html.Node) *Selection {
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
for c := n.FirstChild; c != nil; c = n.FirstChild {
|
||||||
|
n.RemoveChild(c)
|
||||||
|
}
|
||||||
|
for _, c := range ns {
|
||||||
|
n.AppendChild(cloneNode(c))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the first child that is an ElementNode
|
||||||
|
func getFirstChildEl(n *html.Node) *html.Node {
|
||||||
|
c := n.FirstChild
|
||||||
|
for c != nil && c.Type != html.ElementNode {
|
||||||
|
c = c.NextSibling
|
||||||
|
}
|
||||||
|
return c
|
||||||
|
}
|
||||||
|
|
||||||
|
// Deep copy a slice of nodes.
|
||||||
|
func cloneNodes(ns []*html.Node) []*html.Node {
|
||||||
|
cns := make([]*html.Node, 0, len(ns))
|
||||||
|
|
||||||
|
for _, n := range ns {
|
||||||
|
cns = append(cns, cloneNode(n))
|
||||||
|
}
|
||||||
|
|
||||||
|
return cns
|
||||||
|
}
|
||||||
|
|
||||||
|
// Deep copy a node. The new node has clones of all the original node's
|
||||||
|
// children but none of its parents or siblings.
|
||||||
|
func cloneNode(n *html.Node) *html.Node {
|
||||||
|
nn := &html.Node{
|
||||||
|
Type: n.Type,
|
||||||
|
DataAtom: n.DataAtom,
|
||||||
|
Data: n.Data,
|
||||||
|
Attr: make([]html.Attribute, len(n.Attr)),
|
||||||
|
}
|
||||||
|
|
||||||
|
copy(nn.Attr, n.Attr)
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
nn.AppendChild(cloneNode(c))
|
||||||
|
}
|
||||||
|
|
||||||
|
return nn
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *Selection) manipulateNodes(ns []*html.Node, reverse bool,
|
||||||
|
f func(sn *html.Node, n *html.Node)) *Selection {
|
||||||
|
|
||||||
|
lasti := s.Size() - 1
|
||||||
|
|
||||||
|
// net/html doesn't provide document fragments for insertion, so to get
// things in the correct order with After() and Prepend(), the callback
// needs to be called on the reverse of the nodes.
|
||||||
|
if reverse {
|
||||||
|
for i, j := 0, len(ns)-1; i < j; i, j = i+1, j-1 {
|
||||||
|
ns[i], ns[j] = ns[j], ns[i]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, sn := range s.Nodes {
|
||||||
|
for _, n := range ns {
|
||||||
|
if i != lasti {
|
||||||
|
f(sn, cloneNode(n))
|
||||||
|
} else {
|
||||||
|
if n.Parent != nil {
|
||||||
|
n.Parent.RemoveChild(n)
|
||||||
|
}
|
||||||
|
f(sn, n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
275	vendor/github.com/PuerkitoBio/goquery/property.go	generated	vendored	Normal file
@@ -0,0 +1,275 @@
|
||||||
|
package goquery
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
)
|
||||||
|
|
||||||
|
var rxClassTrim = regexp.MustCompile("[\t\r\n]")
|
||||||
|
|
||||||
|
// Attr gets the specified attribute's value for the first element in the
|
||||||
|
// Selection. To get the value for each element individually, use a looping
|
||||||
|
// construct such as the Each or Map method.
|
||||||
|
func (s *Selection) Attr(attrName string) (val string, exists bool) {
|
||||||
|
if len(s.Nodes) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
return getAttributeValue(attrName, s.Nodes[0])
|
||||||
|
}
|
||||||
|
|
||||||
|
// AttrOr works like Attr but returns the default value if the attribute is not present.
|
||||||
|
func (s *Selection) AttrOr(attrName, defaultValue string) string {
|
||||||
|
if len(s.Nodes) == 0 {
|
||||||
|
return defaultValue
|
||||||
|
}
|
||||||
|
|
||||||
|
val, exists := getAttributeValue(attrName, s.Nodes[0])
|
||||||
|
if !exists {
|
||||||
|
return defaultValue
|
||||||
|
}
|
||||||
|
|
||||||
|
return val
|
||||||
|
}
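
// Illustrative usage sketch (not part of the library; doc and the attribute
// values are made up). Attr reports whether the attribute exists, while AttrOr
// substitutes a fallback value:
//
//	href, ok := doc.Find("a").First().Attr("href") // ok is false if no href
//	lang := doc.Find("html").AttrOr("lang", "en")  // "en" when lang is absent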
|
||||||
|
|
||||||
|
// RemoveAttr removes the named attribute from each element in the set of matched elements.
|
||||||
|
func (s *Selection) RemoveAttr(attrName string) *Selection {
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
removeAttr(n, attrName)
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetAttr sets the given attribute on each element in the set of matched elements.
|
||||||
|
func (s *Selection) SetAttr(attrName, val string) *Selection {
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
attr := getAttributePtr(attrName, n)
|
||||||
|
if attr == nil {
|
||||||
|
n.Attr = append(n.Attr, html.Attribute{Key: attrName, Val: val})
|
||||||
|
} else {
|
||||||
|
attr.Val = val
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// Text gets the combined text contents of each element in the set of matched
|
||||||
|
// elements, including their descendants.
|
||||||
|
func (s *Selection) Text() string {
|
||||||
|
var buf bytes.Buffer
|
||||||
|
|
||||||
|
// Slightly optimized vs calling Each: no single selection object created
|
||||||
|
var f func(*html.Node)
|
||||||
|
f = func(n *html.Node) {
|
||||||
|
if n.Type == html.TextNode {
|
||||||
|
// Keep newlines and spaces, like jQuery
|
||||||
|
buf.WriteString(n.Data)
|
||||||
|
}
|
||||||
|
if n.FirstChild != nil {
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
f(c)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
f(n)
|
||||||
|
}
|
||||||
|
|
||||||
|
return buf.String()
|
||||||
|
}
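
// Illustrative usage sketch (not part of the library; doc is an assumed parsed
// Document). Text concatenates the text nodes of every matched element and its
// descendants, keeping whitespace as-is, so trimming is often useful:
//
//	title := strings.TrimSpace(doc.Find("h1").Text())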
|
||||||
|
|
||||||
|
// Size is an alias for Length.
|
||||||
|
func (s *Selection) Size() int {
|
||||||
|
return s.Length()
|
||||||
|
}
|
||||||
|
|
||||||
|
// Length returns the number of elements in the Selection object.
|
||||||
|
func (s *Selection) Length() int {
|
||||||
|
return len(s.Nodes)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Html gets the HTML contents of the first element in the set of matched
|
||||||
|
// elements. It includes text and comment nodes.
|
||||||
|
func (s *Selection) Html() (ret string, e error) {
|
||||||
|
// Since there is no .innerHtml, the HTML content must be re-created from
|
||||||
|
// the nodes using html.Render.
|
||||||
|
var buf bytes.Buffer
|
||||||
|
|
||||||
|
if len(s.Nodes) > 0 {
|
||||||
|
for c := s.Nodes[0].FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
e = html.Render(&buf, c)
|
||||||
|
if e != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
}
|
||||||
|
ret = buf.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
return
|
||||||
|
}
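
// Illustrative usage sketch (not part of the library; doc is an assumed parsed
// Document). Html renders only the first matched element's children, so it
// behaves like innerHTML:
//
//	markup, err := doc.Find("article").Html()
//	if err == nil {
//		fmt.Println(markup)
//	}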
|
||||||
|
|
||||||
|
// AddClass adds the given class(es) to each element in the set of matched elements.
|
||||||
|
// Multiple class names can be specified, separated by a space or via multiple arguments.
|
||||||
|
func (s *Selection) AddClass(class ...string) *Selection {
|
||||||
|
classStr := strings.TrimSpace(strings.Join(class, " "))
|
||||||
|
|
||||||
|
if classStr == "" {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
tcls := getClassesSlice(classStr)
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
curClasses, attr := getClassesAndAttr(n, true)
|
||||||
|
for _, newClass := range tcls {
|
||||||
|
if !strings.Contains(curClasses, " "+newClass+" ") {
|
||||||
|
curClasses += newClass + " "
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
setClasses(n, attr, curClasses)
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasClass determines whether any of the matched elements are assigned the
|
||||||
|
// given class.
|
||||||
|
func (s *Selection) HasClass(class string) bool {
|
||||||
|
class = " " + class + " "
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
classes, _ := getClassesAndAttr(n, false)
|
||||||
|
if strings.Contains(classes, class) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// RemoveClass removes the given class(es) from each element in the set of matched elements.
|
||||||
|
// Multiple class names can be specified, separated by a space or via multiple arguments.
|
||||||
|
// If no class name is provided, all classes are removed.
|
||||||
|
func (s *Selection) RemoveClass(class ...string) *Selection {
|
||||||
|
var rclasses []string
|
||||||
|
|
||||||
|
classStr := strings.TrimSpace(strings.Join(class, " "))
|
||||||
|
remove := classStr == ""
|
||||||
|
|
||||||
|
if !remove {
|
||||||
|
rclasses = getClassesSlice(classStr)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
if remove {
|
||||||
|
removeAttr(n, "class")
|
||||||
|
} else {
|
||||||
|
classes, attr := getClassesAndAttr(n, true)
|
||||||
|
for _, rcl := range rclasses {
|
||||||
|
classes = strings.Replace(classes, " "+rcl+" ", " ", -1)
|
||||||
|
}
|
||||||
|
|
||||||
|
setClasses(n, attr, classes)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// ToggleClass adds or removes the given class(es) for each element in the set of matched elements.
|
||||||
|
// Multiple class names can be specified, separated by a space or via multiple arguments.
|
||||||
|
func (s *Selection) ToggleClass(class ...string) *Selection {
|
||||||
|
classStr := strings.TrimSpace(strings.Join(class, " "))
|
||||||
|
|
||||||
|
if classStr == "" {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
tcls := getClassesSlice(classStr)
|
||||||
|
|
||||||
|
for _, n := range s.Nodes {
|
||||||
|
classes, attr := getClassesAndAttr(n, true)
|
||||||
|
for _, tcl := range tcls {
|
||||||
|
if strings.Contains(classes, " "+tcl+" ") {
|
||||||
|
classes = strings.Replace(classes, " "+tcl+" ", " ", -1)
|
||||||
|
} else {
|
||||||
|
classes += tcl + " "
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
setClasses(n, attr, classes)
|
||||||
|
}
|
||||||
|
|
||||||
|
return s
|
||||||
|
}
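
// Illustrative usage sketch (not part of the library; doc, the selector and
// the class names are made up). The class helpers can be chained, since each
// mutating method returns the same Selection:
//
//	s := doc.Find("nav a").AddClass("link").RemoveClass("active")
//	if !s.HasClass("active") {
//		s.First().ToggleClass("active")
//	}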
|
||||||
|
|
||||||
|
func getAttributePtr(attrName string, n *html.Node) *html.Attribute {
|
||||||
|
if n == nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
for i, a := range n.Attr {
|
||||||
|
if a.Key == attrName {
|
||||||
|
return &n.Attr[i]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Private function to get the specified attribute's value from a node.
|
||||||
|
func getAttributeValue(attrName string, n *html.Node) (val string, exists bool) {
|
||||||
|
if a := getAttributePtr(attrName, n); a != nil {
|
||||||
|
val = a.Val
|
||||||
|
exists = true
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get and normalize the "class" attribute from the node.
|
||||||
|
func getClassesAndAttr(n *html.Node, create bool) (classes string, attr *html.Attribute) {
|
||||||
|
// Applies only to element nodes
|
||||||
|
if n.Type == html.ElementNode {
|
||||||
|
attr = getAttributePtr("class", n)
|
||||||
|
if attr == nil && create {
|
||||||
|
n.Attr = append(n.Attr, html.Attribute{
|
||||||
|
Key: "class",
|
||||||
|
Val: "",
|
||||||
|
})
|
||||||
|
attr = &n.Attr[len(n.Attr)-1]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if attr == nil {
|
||||||
|
classes = " "
|
||||||
|
} else {
|
||||||
|
classes = rxClassTrim.ReplaceAllString(" "+attr.Val+" ", " ")
|
||||||
|
}
|
||||||
|
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func getClassesSlice(classes string) []string {
|
||||||
|
return strings.Split(rxClassTrim.ReplaceAllString(" "+classes+" ", " "), " ")
|
||||||
|
}
|
||||||
|
|
||||||
|
func removeAttr(n *html.Node, attrName string) {
|
||||||
|
for i, a := range n.Attr {
|
||||||
|
if a.Key == attrName {
|
||||||
|
n.Attr[i], n.Attr[len(n.Attr)-1], n.Attr =
|
||||||
|
n.Attr[len(n.Attr)-1], html.Attribute{}, n.Attr[:len(n.Attr)-1]
|
||||||
|
return
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func setClasses(n *html.Node, attr *html.Attribute, classes string) {
|
||||||
|
classes = strings.TrimSpace(classes)
|
||||||
|
if classes == "" {
|
||||||
|
removeAttr(n, "class")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
attr.Val = classes
|
||||||
|
}
|
49	vendor/github.com/PuerkitoBio/goquery/query.go	generated	vendored	Normal file
@@ -0,0 +1,49 @@
package goquery

import "golang.org/x/net/html"

// Is checks the current matched set of elements against a selector and
// returns true if at least one of these elements matches.
func (s *Selection) Is(selector string) bool {
	return s.IsMatcher(compileMatcher(selector))
}

// IsMatcher checks the current matched set of elements against a matcher and
// returns true if at least one of these elements matches.
func (s *Selection) IsMatcher(m Matcher) bool {
	if len(s.Nodes) > 0 {
		if len(s.Nodes) == 1 {
			return m.Match(s.Nodes[0])
		}
		return len(m.Filter(s.Nodes)) > 0
	}

	return false
}

// IsFunction checks the current matched set of elements against a predicate and
// returns true if at least one of these elements matches.
func (s *Selection) IsFunction(f func(int, *Selection) bool) bool {
	return s.FilterFunction(f).Length() > 0
}

// IsSelection checks the current matched set of elements against a Selection object
// and returns true if at least one of these elements matches.
func (s *Selection) IsSelection(sel *Selection) bool {
	return s.FilterSelection(sel).Length() > 0
}

// IsNodes checks the current matched set of elements against the specified nodes
// and returns true if at least one of these elements matches.
func (s *Selection) IsNodes(nodes ...*html.Node) bool {
	return s.FilterNodes(nodes...).Length() > 0
}

// Contains returns true if the specified Node is within, at any depth, one of
// the nodes in the Selection object. Like jQuery's implementation, and unlike
// JavaScript's .contains, it is NOT inclusive: if the contained node is itself
// in the selection, it returns false.
func (s *Selection) Contains(n *html.Node) bool {
	return sliceContains(s.Nodes, n)
}
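
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). Is answers whether any matched element satisfies a selector,
// and Contains tests strict descendance:
//
//	if doc.Find("input").Is("[type=checkbox]") {
//		fmt.Println("at least one checkbox input")
//	}
//	form := doc.Find("form")
//	if form.Length() > 0 && doc.Find("body").Contains(form.Nodes[0]) {
//		fmt.Println("the form sits under <body>")
//	}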
|
698	vendor/github.com/PuerkitoBio/goquery/traversal.go	generated	vendored	Normal file
@@ -0,0 +1,698 @@
|
||||||
|
package goquery
|
||||||
|
|
||||||
|
import "golang.org/x/net/html"
|
||||||
|
|
||||||
|
type siblingType int
|
||||||
|
|
||||||
|
// Sibling type, used internally when iterating over children at the same
|
||||||
|
// level (siblings) to specify which nodes are requested.
|
||||||
|
const (
|
||||||
|
siblingPrevUntil siblingType = iota - 3
|
||||||
|
siblingPrevAll
|
||||||
|
siblingPrev
|
||||||
|
siblingAll
|
||||||
|
siblingNext
|
||||||
|
siblingNextAll
|
||||||
|
siblingNextUntil
|
||||||
|
siblingAllIncludingNonElements
|
||||||
|
)
|
||||||
|
|
||||||
|
// Find gets the descendants of each element in the current set of matched
|
||||||
|
// elements, filtered by a selector. It returns a new Selection object
|
||||||
|
// containing these matched elements.
|
||||||
|
func (s *Selection) Find(selector string) *Selection {
|
||||||
|
return pushStack(s, findWithMatcher(s.Nodes, compileMatcher(selector)))
|
||||||
|
}
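
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). Find only searches descendants of the current selection, so
// scoping a query is a matter of chaining:
//
//	rows := doc.Find("table#data").Find("tr")
//	rows.Each(func(i int, row *Selection) {
//		fmt.Println(i, row.Find("td").First().Text())
//	})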
|
||||||
|
|
||||||
|
// FindMatcher gets the descendants of each element in the current set of matched
|
||||||
|
// elements, filtered by the matcher. It returns a new Selection object
|
||||||
|
// containing these matched elements.
|
||||||
|
func (s *Selection) FindMatcher(m Matcher) *Selection {
|
||||||
|
return pushStack(s, findWithMatcher(s.Nodes, m))
|
||||||
|
}
|
||||||
|
|
||||||
|
// FindSelection gets the descendants of each element in the current
|
||||||
|
// Selection, filtered by a Selection. It returns a new Selection object
|
||||||
|
// containing these matched elements.
|
||||||
|
func (s *Selection) FindSelection(sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return pushStack(s, nil)
|
||||||
|
}
|
||||||
|
return s.FindNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// FindNodes gets the descendants of each element in the current
|
||||||
|
// Selection, filtered by some nodes. It returns a new Selection object
|
||||||
|
// containing these matched elements.
|
||||||
|
func (s *Selection) FindNodes(nodes ...*html.Node) *Selection {
|
||||||
|
return pushStack(s, mapNodes(nodes, func(i int, n *html.Node) []*html.Node {
|
||||||
|
if sliceContains(s.Nodes, n) {
|
||||||
|
return []*html.Node{n}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Contents gets the children of each element in the Selection,
|
||||||
|
// including text and comment nodes. It returns a new Selection object
|
||||||
|
// containing these elements.
|
||||||
|
func (s *Selection) Contents() *Selection {
|
||||||
|
return pushStack(s, getChildrenNodes(s.Nodes, siblingAllIncludingNonElements))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ContentsFiltered gets the children of each element in the Selection,
|
||||||
|
// filtered by the specified selector. It returns a new Selection
|
||||||
|
// object containing these elements. Since selectors only act on Element nodes,
|
||||||
|
// this function is an alias to ChildrenFiltered unless the selector is empty,
|
||||||
|
// in which case it is an alias to Contents.
|
||||||
|
func (s *Selection) ContentsFiltered(selector string) *Selection {
|
||||||
|
if selector != "" {
|
||||||
|
return s.ChildrenFiltered(selector)
|
||||||
|
}
|
||||||
|
return s.Contents()
|
||||||
|
}
|
||||||
|
|
||||||
|
// ContentsMatcher gets the children of each element in the Selection,
|
||||||
|
// filtered by the specified matcher. It returns a new Selection
|
||||||
|
// object containing these elements. Since matchers only act on Element nodes,
|
||||||
|
// this function is an alias to ChildrenMatcher.
|
||||||
|
func (s *Selection) ContentsMatcher(m Matcher) *Selection {
|
||||||
|
return s.ChildrenMatcher(m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Children gets the child elements of each element in the Selection.
|
||||||
|
// It returns a new Selection object containing these elements.
|
||||||
|
func (s *Selection) Children() *Selection {
|
||||||
|
return pushStack(s, getChildrenNodes(s.Nodes, siblingAll))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ChildrenFiltered gets the child elements of each element in the Selection,
|
||||||
|
// filtered by the specified selector. It returns a new
|
||||||
|
// Selection object containing these elements.
|
||||||
|
func (s *Selection) ChildrenFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getChildrenNodes(s.Nodes, siblingAll), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ChildrenMatcher gets the child elements of each element in the Selection,
|
||||||
|
// filtered by the specified matcher. It returns a new
|
||||||
|
// Selection object containing these elements.
|
||||||
|
func (s *Selection) ChildrenMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getChildrenNodes(s.Nodes, siblingAll), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parent gets the parent of each element in the Selection. It returns a
|
||||||
|
// new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) Parent() *Selection {
|
||||||
|
return pushStack(s, getParentNodes(s.Nodes))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentFiltered gets the parent of each element in the Selection filtered by a
|
||||||
|
// selector. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getParentNodes(s.Nodes), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentMatcher gets the parent of each element in the Selection filtered by a
|
||||||
|
// matcher. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getParentNodes(s.Nodes), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Closest gets the first element that matches the selector by testing the
|
||||||
|
// element itself and traversing up through its ancestors in the DOM tree.
|
||||||
|
func (s *Selection) Closest(selector string) *Selection {
|
||||||
|
cs := compileMatcher(selector)
|
||||||
|
return s.ClosestMatcher(cs)
|
||||||
|
}
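
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). Closest starts at the element itself and walks up the
// ancestor chain, stopping at the first match:
//
//	cell := doc.Find("td.total").First()
//	table := cell.Closest("table") // empty Selection if no ancestor matches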
|
||||||
|
|
||||||
|
// ClosestMatcher gets the first element that matches the matcher by testing the
|
||||||
|
// element itself and traversing up through its ancestors in the DOM tree.
|
||||||
|
func (s *Selection) ClosestMatcher(m Matcher) *Selection {
|
||||||
|
return pushStack(s, mapNodes(s.Nodes, func(i int, n *html.Node) []*html.Node {
|
||||||
|
// For each node in the selection, test the node itself, then each parent
|
||||||
|
// until a match is found.
|
||||||
|
for ; n != nil; n = n.Parent {
|
||||||
|
if m.Match(n) {
|
||||||
|
return []*html.Node{n}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ClosestNodes gets the first element that matches one of the nodes by testing the
|
||||||
|
// element itself and traversing up through its ancestors in the DOM tree.
|
||||||
|
func (s *Selection) ClosestNodes(nodes ...*html.Node) *Selection {
|
||||||
|
set := make(map[*html.Node]bool)
|
||||||
|
for _, n := range nodes {
|
||||||
|
set[n] = true
|
||||||
|
}
|
||||||
|
return pushStack(s, mapNodes(s.Nodes, func(i int, n *html.Node) []*html.Node {
|
||||||
|
// For each node in the selection, test the node itself, then each parent
|
||||||
|
// until a match is found.
|
||||||
|
for ; n != nil; n = n.Parent {
|
||||||
|
if set[n] {
|
||||||
|
return []*html.Node{n}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ClosestSelection gets the first element that matches one of the nodes in the
|
||||||
|
// Selection by testing the element itself and traversing up through its ancestors
|
||||||
|
// in the DOM tree.
|
||||||
|
func (s *Selection) ClosestSelection(sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return pushStack(s, nil)
|
||||||
|
}
|
||||||
|
return s.ClosestNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parents gets the ancestors of each element in the current Selection. It
|
||||||
|
// returns a new Selection object with the matched elements.
|
||||||
|
func (s *Selection) Parents() *Selection {
|
||||||
|
return pushStack(s, getParentsNodes(s.Nodes, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsFiltered gets the ancestors of each element in the current
// Selection, filtered by a selector. It returns a new Selection object with
// the matched elements.
|
||||||
|
func (s *Selection) ParentsFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsMatcher gets the ancestors of each element in the current
// Selection, filtered by a matcher. It returns a new Selection object with
// the matched elements.
|
||||||
|
func (s *Selection) ParentsMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsUntil gets the ancestors of each element in the Selection, up to but
|
||||||
|
// not including the element matched by the selector. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsUntil(selector string) *Selection {
|
||||||
|
return pushStack(s, getParentsNodes(s.Nodes, compileMatcher(selector), nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsUntilMatcher gets the ancestors of each element in the Selection, up to but
|
||||||
|
// not including the element matched by the matcher. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsUntilMatcher(m Matcher) *Selection {
|
||||||
|
return pushStack(s, getParentsNodes(s.Nodes, m, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsUntilSelection gets the ancestors of each element in the Selection,
|
||||||
|
// up to but not including the elements in the specified Selection. It returns a
|
||||||
|
// new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsUntilSelection(sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.Parents()
|
||||||
|
}
|
||||||
|
return s.ParentsUntilNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsUntilNodes gets the ancestors of each element in the Selection,
|
||||||
|
// up to but not including the specified nodes. It returns a
|
||||||
|
// new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsUntilNodes(nodes ...*html.Node) *Selection {
|
||||||
|
return pushStack(s, getParentsNodes(s.Nodes, nil, nodes))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsFilteredUntil is like ParentsUntil, with the option to filter the
|
||||||
|
// results based on a selector string. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsFilteredUntil(filterSelector, untilSelector string) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, compileMatcher(untilSelector), nil), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsFilteredUntilMatcher is like ParentsUntilMatcher, with the option to filter the
|
||||||
|
// results based on a matcher. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsFilteredUntilMatcher(filter, until Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, until, nil), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsFilteredUntilSelection is like ParentsUntilSelection, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsFilteredUntilSelection(filterSelector string, sel *Selection) *Selection {
|
||||||
|
return s.ParentsMatcherUntilSelection(compileMatcher(filterSelector), sel)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsMatcherUntilSelection is like ParentsUntilSelection, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsMatcherUntilSelection(filter Matcher, sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.ParentsMatcher(filter)
|
||||||
|
}
|
||||||
|
return s.ParentsMatcherUntilNodes(filter, sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsFilteredUntilNodes is like ParentsUntilNodes, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsFilteredUntilNodes(filterSelector string, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, nil, nodes), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParentsMatcherUntilNodes is like ParentsUntilNodes, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) ParentsMatcherUntilNodes(filter Matcher, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getParentsNodes(s.Nodes, nil, nodes), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Siblings gets the siblings of each element in the Selection. It returns
|
||||||
|
// a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) Siblings() *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingAll, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// SiblingsFiltered gets the siblings of each element in the Selection
|
||||||
|
// filtered by a selector. It returns a new Selection object containing the
|
||||||
|
// matched elements.
|
||||||
|
func (s *Selection) SiblingsFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingAll, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// SiblingsMatcher gets the siblings of each element in the Selection
|
||||||
|
// filtered by a matcher. It returns a new Selection object containing the
|
||||||
|
// matched elements.
|
||||||
|
func (s *Selection) SiblingsMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingAll, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Next gets the immediately following sibling of each element in the
|
||||||
|
// Selection. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) Next() *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingNext, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextFiltered gets the immediately following sibling of each element in the
|
||||||
|
// Selection filtered by a selector. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) NextFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNext, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextMatcher gets the immediately following sibling of each element in the
|
||||||
|
// Selection filtered by a matcher. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) NextMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNext, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextAll gets all the following siblings of each element in the
|
||||||
|
// Selection. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextAll() *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingNextAll, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextAllFiltered gets all the following siblings of each element in the
|
||||||
|
// Selection filtered by a selector. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) NextAllFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextAll, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextAllMatcher gets all the following siblings of each element in the
|
||||||
|
// Selection filtered by a matcher. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) NextAllMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextAll, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Prev gets the immediately preceding sibling of each element in the
|
||||||
|
// Selection. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) Prev() *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingPrev, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevFiltered gets the immediately preceding sibling of each element in the
|
||||||
|
// Selection filtered by a selector. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) PrevFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrev, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevMatcher gets the immediately preceding sibling of each element in the
|
||||||
|
// Selection filtered by a matcher. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) PrevMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrev, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevAll gets all the preceding siblings of each element in the
|
||||||
|
// Selection. It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevAll() *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingPrevAll, nil, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevAllFiltered gets all the preceding siblings of each element in the
|
||||||
|
// Selection filtered by a selector. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) PrevAllFiltered(selector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevAll, nil, nil), compileMatcher(selector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevAllMatcher gets all the preceding siblings of each element in the
|
||||||
|
// Selection filtered by a matcher. It returns a new Selection object
|
||||||
|
// containing the matched elements.
|
||||||
|
func (s *Selection) PrevAllMatcher(m Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevAll, nil, nil), m)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextUntil gets all following siblings of each element up to but not
|
||||||
|
// including the element matched by the selector. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) NextUntil(selector string) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
compileMatcher(selector), nil))
|
||||||
|
}
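
// Illustrative usage sketch (not part of the library; doc and the selectors
// are made up). NextUntil collects the following siblings up to, but not
// including, the first one that matches the stop selector:
//
//	// All paragraphs between the first <h2> and the next <h2>.
//	section := doc.Find("h2").First().NextUntil("h2").Filter("p")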
|
||||||
|
|
||||||
|
// NextUntilMatcher gets all following siblings of each element up to but not
|
||||||
|
// including the element matched by the matcher. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) NextUntilMatcher(m Matcher) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
m, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextUntilSelection gets all following siblings of each element up to but not
|
||||||
|
// including the element matched by the Selection. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) NextUntilSelection(sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.NextAll()
|
||||||
|
}
|
||||||
|
return s.NextUntilNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextUntilNodes gets all following siblings of each element up to but not
|
||||||
|
// including the element matched by the nodes. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) NextUntilNodes(nodes ...*html.Node) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
nil, nodes))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevUntil gets all preceding siblings of each element up to but not
|
||||||
|
// including the element matched by the selector. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) PrevUntil(selector string) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
compileMatcher(selector), nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevUntilMatcher gets all preceding siblings of each element up to but not
|
||||||
|
// including the element matched by the matcher. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) PrevUntilMatcher(m Matcher) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
m, nil))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevUntilSelection gets all preceding siblings of each element up to but not
|
||||||
|
// including the element matched by the Selection. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) PrevUntilSelection(sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.PrevAll()
|
||||||
|
}
|
||||||
|
return s.PrevUntilNodes(sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevUntilNodes gets all preceding siblings of each element up to but not
|
||||||
|
// including the element matched by the nodes. It returns a new Selection
|
||||||
|
// object containing the matched elements.
|
||||||
|
func (s *Selection) PrevUntilNodes(nodes ...*html.Node) *Selection {
|
||||||
|
return pushStack(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
nil, nodes))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextFilteredUntil is like NextUntil, with the option to filter
|
||||||
|
// the results based on a selector string.
|
||||||
|
// It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextFilteredUntil(filterSelector, untilSelector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
compileMatcher(untilSelector), nil), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextFilteredUntilMatcher is like NextUntilMatcher, with the option to filter
|
||||||
|
// the results based on a matcher.
|
||||||
|
// It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextFilteredUntilMatcher(filter, until Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
until, nil), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextFilteredUntilSelection is like NextUntilSelection, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextFilteredUntilSelection(filterSelector string, sel *Selection) *Selection {
|
||||||
|
return s.NextMatcherUntilSelection(compileMatcher(filterSelector), sel)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextMatcherUntilSelection is like NextUntilSelection, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextMatcherUntilSelection(filter Matcher, sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.NextMatcher(filter)
|
||||||
|
}
|
||||||
|
return s.NextMatcherUntilNodes(filter, sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextFilteredUntilNodes is like NextUntilNodes, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextFilteredUntilNodes(filterSelector string, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
nil, nodes), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// NextMatcherUntilNodes is like NextUntilNodes, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) NextMatcherUntilNodes(filter Matcher, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingNextUntil,
|
||||||
|
nil, nodes), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevFilteredUntil is like PrevUntil, with the option to filter
|
||||||
|
// the results based on a selector string.
|
||||||
|
// It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevFilteredUntil(filterSelector, untilSelector string) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
compileMatcher(untilSelector), nil), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevFilteredUntilMatcher is like PrevUntilMatcher, with the option to filter
|
||||||
|
// the results based on a matcher.
|
||||||
|
// It returns a new Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevFilteredUntilMatcher(filter, until Matcher) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
until, nil), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevFilteredUntilSelection is like PrevUntilSelection, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevFilteredUntilSelection(filterSelector string, sel *Selection) *Selection {
|
||||||
|
return s.PrevMatcherUntilSelection(compileMatcher(filterSelector), sel)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevMatcherUntilSelection is like PrevUntilSelection, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevMatcherUntilSelection(filter Matcher, sel *Selection) *Selection {
|
||||||
|
if sel == nil {
|
||||||
|
return s.PrevMatcher(filter)
|
||||||
|
}
|
||||||
|
return s.PrevMatcherUntilNodes(filter, sel.Nodes...)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevFilteredUntilNodes is like PrevUntilNodes, with the
|
||||||
|
// option to filter the results based on a selector string. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevFilteredUntilNodes(filterSelector string, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
nil, nodes), compileMatcher(filterSelector))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrevMatcherUntilNodes is like PrevUntilNodes, with the
|
||||||
|
// option to filter the results based on a matcher. It returns a new
|
||||||
|
// Selection object containing the matched elements.
|
||||||
|
func (s *Selection) PrevMatcherUntilNodes(filter Matcher, nodes ...*html.Node) *Selection {
|
||||||
|
return filterAndPush(s, getSiblingNodes(s.Nodes, siblingPrevUntil,
|
||||||
|
nil, nodes), filter)
|
||||||
|
}
|
||||||
|
|
||||||
|
// filterAndPush filters the nodes based on a matcher, and pushes the results
// on the stack, with srcSel as the previous selection.
|
||||||
|
func filterAndPush(srcSel *Selection, nodes []*html.Node, m Matcher) *Selection {
|
||||||
|
// Create a temporary Selection with the specified nodes to filter using winnow
|
||||||
|
sel := &Selection{nodes, srcSel.document, nil}
|
||||||
|
// Filter based on matcher and push on stack
|
||||||
|
return pushStack(srcSel, winnow(sel, m, true))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Internal implementation of Find that return raw nodes.
|
||||||
|
func findWithMatcher(nodes []*html.Node, m Matcher) []*html.Node {
|
||||||
|
// Map nodes to find the matches within the children of each node
|
||||||
|
return mapNodes(nodes, func(i int, n *html.Node) (result []*html.Node) {
|
||||||
|
// Go down one level, because jQuery's Find selects only within descendants
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if c.Type == html.ElementNode {
|
||||||
|
result = append(result, m.MatchAll(c)...)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Internal implementation to get all parent nodes, stopping at the specified
|
||||||
|
// node (or nil if no stop).
|
||||||
|
func getParentsNodes(nodes []*html.Node, stopm Matcher, stopNodes []*html.Node) []*html.Node {
|
||||||
|
return mapNodes(nodes, func(i int, n *html.Node) (result []*html.Node) {
|
||||||
|
for p := n.Parent; p != nil; p = p.Parent {
|
||||||
|
sel := newSingleSelection(p, nil)
|
||||||
|
if stopm != nil {
|
||||||
|
if sel.IsMatcher(stopm) {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
} else if len(stopNodes) > 0 {
|
||||||
|
if sel.IsNodes(stopNodes...) {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if p.Type == html.ElementNode {
|
||||||
|
result = append(result, p)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Internal implementation of sibling nodes that return a raw slice of matches.
|
||||||
|
func getSiblingNodes(nodes []*html.Node, st siblingType, untilm Matcher, untilNodes []*html.Node) []*html.Node {
|
||||||
|
var f func(*html.Node) bool
|
||||||
|
|
||||||
|
// If the requested siblings are ...Until, create the test function to
|
||||||
|
// determine if the until condition is reached (returns true if it is)
|
||||||
|
if st == siblingNextUntil || st == siblingPrevUntil {
|
||||||
|
f = func(n *html.Node) bool {
|
||||||
|
if untilm != nil {
|
||||||
|
// Matcher-based condition
|
||||||
|
sel := newSingleSelection(n, nil)
|
||||||
|
return sel.IsMatcher(untilm)
|
||||||
|
} else if len(untilNodes) > 0 {
|
				// Nodes-based condition
				sel := newSingleSelection(n, nil)
				return sel.IsNodes(untilNodes...)
			}
			return false
		}
	}

	return mapNodes(nodes, func(i int, n *html.Node) []*html.Node {
		return getChildrenWithSiblingType(n.Parent, st, n, f)
	})
}

// Gets the children nodes of each node in the specified slice of nodes,
// based on the sibling type request.
func getChildrenNodes(nodes []*html.Node, st siblingType) []*html.Node {
	return mapNodes(nodes, func(i int, n *html.Node) []*html.Node {
		return getChildrenWithSiblingType(n, st, nil, nil)
	})
}

// Gets the children of the specified parent, based on the requested sibling
// type, skipping a specified node if required.
func getChildrenWithSiblingType(parent *html.Node, st siblingType, skipNode *html.Node,
	untilFunc func(*html.Node) bool) (result []*html.Node) {

	// Create the iterator function
	var iter = func(cur *html.Node) (ret *html.Node) {
		// Based on the sibling type requested, iterate the right way
		for {
			switch st {
			case siblingAll, siblingAllIncludingNonElements:
				if cur == nil {
					// First iteration, start with first child of parent
					// Skip node if required
					if ret = parent.FirstChild; ret == skipNode && skipNode != nil {
						ret = skipNode.NextSibling
					}
				} else {
					// Skip node if required
					if ret = cur.NextSibling; ret == skipNode && skipNode != nil {
						ret = skipNode.NextSibling
					}
				}
			case siblingPrev, siblingPrevAll, siblingPrevUntil:
				if cur == nil {
					// Start with previous sibling of the skip node
					ret = skipNode.PrevSibling
				} else {
					ret = cur.PrevSibling
				}
			case siblingNext, siblingNextAll, siblingNextUntil:
				if cur == nil {
					// Start with next sibling of the skip node
					ret = skipNode.NextSibling
				} else {
					ret = cur.NextSibling
				}
			default:
				panic("Invalid sibling type.")
			}
			if ret == nil || ret.Type == html.ElementNode || st == siblingAllIncludingNonElements {
				return
			}
			// Not a valid node, try again from this one
			cur = ret
		}
	}

	for c := iter(nil); c != nil; c = iter(c) {
		// If this is an ...Until case, test before append (returns true
		// if the until condition is reached)
		if st == siblingNextUntil || st == siblingPrevUntil {
			if untilFunc(c) {
				return
			}
		}
		result = append(result, c)
		if st == siblingNext || st == siblingPrev {
			// Only one node was requested (immediate next or previous), so exit
			return
		}
	}
	return
}

// Internal implementation of parent nodes that return a raw slice of Nodes.
func getParentNodes(nodes []*html.Node) []*html.Node {
	return mapNodes(nodes, func(i int, n *html.Node) []*html.Node {
		if n.Parent != nil && n.Parent.Type == html.ElementNode {
			return []*html.Node{n.Parent}
		}
		return nil
	})
}

// Internal map function used by many traversing methods. Takes the source nodes
// to iterate on and the mapping function that returns an array of nodes.
// Returns an array of nodes mapped by calling the callback function once for
// each node in the source nodes.
func mapNodes(nodes []*html.Node, f func(int, *html.Node) []*html.Node) (result []*html.Node) {
	set := make(map[*html.Node]bool)
	for i, n := range nodes {
		if vals := f(i, n); len(vals) > 0 {
			result = appendWithoutDuplicates(result, vals, set)
		}
	}
	return result
}
|
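// Illustrative sketch, not part of the vendored traverse.go: the helpers above
// back the public traversal methods. Because mapNodes funnels every result
// through appendWithoutDuplicates, calling Parent() on two sibling <li> nodes
// returns their shared <ul> only once. Assumes imports of fmt, log, strings
// and github.com/PuerkitoBio/goquery.
func exampleDedupedParents() {
	doc, err := goquery.NewDocumentFromReader(strings.NewReader(
		"<ul><li>a</li><li>b</li></ul>"))
	if err != nil {
		log.Fatal(err)
	}
	// Two <li> source nodes, but a single deduplicated parent in the result.
	fmt.Println(doc.Find("li").Length())          // 2
	fmt.Println(doc.Find("li").Parent().Length()) // 1
}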
141
vendor/github.com/PuerkitoBio/goquery/type.go
generated
vendored
Normal file
141
vendor/github.com/PuerkitoBio/goquery/type.go
generated
vendored
Normal file
|
@ -0,0 +1,141 @@
package goquery

import (
	"errors"
	"io"
	"net/http"
	"net/url"

	"github.com/andybalholm/cascadia"

	"golang.org/x/net/html"
)

// Document represents an HTML document to be manipulated. Unlike jQuery, which
// is loaded as part of a DOM document, and thus acts upon its containing
// document, GoQuery doesn't know which HTML document to act upon. So it needs
// to be told, and that's what the Document class is for. It holds the root
// document node to manipulate, and can make selections on this document.
type Document struct {
	*Selection
	Url      *url.URL
	rootNode *html.Node
}

// NewDocumentFromNode is a Document constructor that takes a root html Node
// as argument.
func NewDocumentFromNode(root *html.Node) *Document {
	return newDocument(root, nil)
}
|
||||||
|
|
||||||
|
// NewDocument is a Document constructor that takes a string URL as argument.
|
||||||
|
// It loads the specified document, parses it, and stores the root Document
|
||||||
|
// node, ready to be manipulated.
|
||||||
|
//
|
||||||
|
// Deprecated: Use the net/http standard library package to make the request
|
||||||
|
// and validate the response before calling goquery.NewDocumentFromReader
|
||||||
|
// with the response's body.
|
||||||
|
func NewDocument(url string) (*Document, error) {
|
||||||
|
// Load the URL
|
||||||
|
res, e := http.Get(url)
|
||||||
|
if e != nil {
|
||||||
|
return nil, e
|
||||||
|
}
|
||||||
|
return NewDocumentFromResponse(res)
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewDocumentFromReader returns a Document from an io.Reader.
|
||||||
|
// It returns an error as second value if the reader's data cannot be parsed
|
||||||
|
// as html. It does not check if the reader is also an io.Closer, the
|
||||||
|
// provided reader is never closed by this call. It is the responsibility
|
||||||
|
// of the caller to close it if required.
|
||||||
|
func NewDocumentFromReader(r io.Reader) (*Document, error) {
|
||||||
|
root, e := html.Parse(r)
|
||||||
|
if e != nil {
|
||||||
|
return nil, e
|
||||||
|
}
|
||||||
|
return newDocument(root, nil), nil
|
||||||
|
}
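// Illustrative sketch of the pattern the deprecation notes above recommend:
// make the HTTP request yourself, validate the response, then hand the body
// to NewDocumentFromReader. Not part of the vendored type.go; assumes imports
// of fmt, net/http and github.com/PuerkitoBio/goquery, and the URL is a
// caller-supplied placeholder.
func loadDocument(url string) (*goquery.Document, error) {
	res, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer res.Body.Close()
	if res.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("unexpected status: %s", res.Status)
	}
	// The caller keeps ownership of the body; NewDocumentFromReader never closes it.
	return goquery.NewDocumentFromReader(res.Body)
}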
|
||||||
|
|
||||||
|
// NewDocumentFromResponse is another Document constructor that takes an http response as argument.
|
||||||
|
// It loads the specified response's document, parses it, and stores the root Document
|
||||||
|
// node, ready to be manipulated. The response's body is closed on return.
|
||||||
|
//
|
||||||
|
// Deprecated: Use goquery.NewDocumentFromReader with the response's body.
|
||||||
|
func NewDocumentFromResponse(res *http.Response) (*Document, error) {
|
||||||
|
if res == nil {
|
||||||
|
return nil, errors.New("Response is nil")
|
||||||
|
}
|
||||||
|
defer res.Body.Close()
|
||||||
|
if res.Request == nil {
|
||||||
|
return nil, errors.New("Response.Request is nil")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse the HTML into nodes
|
||||||
|
root, e := html.Parse(res.Body)
|
||||||
|
if e != nil {
|
||||||
|
return nil, e
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create and fill the document
|
||||||
|
return newDocument(root, res.Request.URL), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// CloneDocument creates a deep-clone of a document.
|
||||||
|
func CloneDocument(doc *Document) *Document {
|
||||||
|
return newDocument(cloneNode(doc.rootNode), doc.Url)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Private constructor, make sure all fields are correctly filled.
|
||||||
|
func newDocument(root *html.Node, url *url.URL) *Document {
|
||||||
|
// Create and fill the document
|
||||||
|
d := &Document{nil, url, root}
|
||||||
|
d.Selection = newSingleSelection(root, d)
|
||||||
|
return d
|
||||||
|
}
|
||||||
|
|
||||||
|
// Selection represents a collection of nodes matching some criteria. The
|
||||||
|
// initial Selection can be created by using Document.Find, and then
|
||||||
|
// manipulated using the jQuery-like chainable syntax and methods.
|
||||||
|
type Selection struct {
|
||||||
|
Nodes []*html.Node
|
||||||
|
document *Document
|
||||||
|
prevSel *Selection
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper constructor to create an empty selection
|
||||||
|
func newEmptySelection(doc *Document) *Selection {
|
||||||
|
return &Selection{nil, doc, nil}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper constructor to create a selection of only one node
|
||||||
|
func newSingleSelection(node *html.Node, doc *Document) *Selection {
|
||||||
|
return &Selection{[]*html.Node{node}, doc, nil}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Matcher is an interface that defines the methods to match
|
||||||
|
// HTML nodes against a compiled selector string. Cascadia's
|
||||||
|
// Selector implements this interface.
|
||||||
|
type Matcher interface {
|
||||||
|
Match(*html.Node) bool
|
||||||
|
MatchAll(*html.Node) []*html.Node
|
||||||
|
Filter([]*html.Node) []*html.Node
|
||||||
|
}
|
||||||
|
|
||||||
|
// compileMatcher compiles the selector string s and returns
|
||||||
|
// the corresponding Matcher. If s is an invalid selector string,
|
||||||
|
// it returns a Matcher that fails all matches.
|
||||||
|
func compileMatcher(s string) Matcher {
|
||||||
|
cs, err := cascadia.Compile(s)
|
||||||
|
if err != nil {
|
||||||
|
return invalidMatcher{}
|
||||||
|
}
|
||||||
|
return cs
|
||||||
|
}
|
||||||
|
|
||||||
|
// invalidMatcher is a Matcher that always fails to match.
|
||||||
|
type invalidMatcher struct{}
|
||||||
|
|
||||||
|
func (invalidMatcher) Match(n *html.Node) bool { return false }
|
||||||
|
func (invalidMatcher) MatchAll(n *html.Node) []*html.Node { return nil }
|
||||||
|
func (invalidMatcher) Filter(ns []*html.Node) []*html.Node { return nil }
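// Illustrative sketch, not part of the vendored type.go: any value with the
// Match/MatchAll/Filter methods satisfies Matcher, so a precompiled
// cascadia.Selector can be reused across queries through the *Matcher variants
// of the traversal methods. Assumes imports of log, strings,
// github.com/PuerkitoBio/goquery and github.com/andybalholm/cascadia.
func exampleMatcher() {
	m, err := cascadia.Compile("div.article > p")
	if err != nil {
		log.Fatal(err) // unlike compileMatcher, the caller sees the error here
	}
	doc, err := goquery.NewDocumentFromReader(strings.NewReader(
		`<div class="article"><p>hi</p></div>`))
	if err != nil {
		log.Fatal(err)
	}
	// FindMatcher accepts any goquery.Matcher, so the compiled selector is reused as-is.
	_ = doc.FindMatcher(m)
}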
|
161
vendor/github.com/PuerkitoBio/goquery/utilities.go
generated
vendored
Normal file
161
vendor/github.com/PuerkitoBio/goquery/utilities.go
generated
vendored
Normal file
|
@ -0,0 +1,161 @@
|
||||||
|
package goquery
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
)
|
||||||
|
|
||||||
|
// used to determine if a set (map[*html.Node]bool) should be used
|
||||||
|
// instead of iterating over a slice. The set uses more memory and
|
||||||
|
// is slower than slice iteration for small N.
|
||||||
|
const minNodesForSet = 1000
|
||||||
|
|
||||||
|
var nodeNames = []string{
|
||||||
|
html.ErrorNode: "#error",
|
||||||
|
html.TextNode: "#text",
|
||||||
|
html.DocumentNode: "#document",
|
||||||
|
html.CommentNode: "#comment",
|
||||||
|
}
|
||||||
|
|
||||||
|
// NodeName returns the node name of the first element in the selection.
|
||||||
|
// It tries to behave in a similar way as the DOM's nodeName property
|
||||||
|
// (https://developer.mozilla.org/en-US/docs/Web/API/Node/nodeName).
|
||||||
|
//
|
||||||
|
// Go's net/html package defines the following node types, listed with
|
||||||
|
// the corresponding returned value from this function:
|
||||||
|
//
|
||||||
|
// ErrorNode : #error
|
||||||
|
// TextNode : #text
|
||||||
|
// DocumentNode : #document
|
||||||
|
// ElementNode : the element's tag name
|
||||||
|
// CommentNode : #comment
|
||||||
|
// DoctypeNode : the name of the document type
|
||||||
|
//
|
||||||
|
func NodeName(s *Selection) string {
|
||||||
|
if s.Length() == 0 {
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
switch n := s.Get(0); n.Type {
|
||||||
|
case html.ElementNode, html.DoctypeNode:
|
||||||
|
return n.Data
|
||||||
|
default:
|
||||||
|
if n.Type >= 0 && int(n.Type) < len(nodeNames) {
|
||||||
|
return nodeNames[n.Type]
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
}
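// Illustrative sketch, not part of the vendored utilities.go: what NodeName
// reports for a few node types, following the table in the comment above.
// Assumes imports of fmt, log, strings and github.com/PuerkitoBio/goquery.
func exampleNodeName() {
	doc, err := goquery.NewDocumentFromReader(strings.NewReader(
		"<!DOCTYPE html><html><body><p>text</p></body></html>"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(goquery.NodeName(doc.Find("p")))                    // "p" (ElementNode: tag name)
	fmt.Println(goquery.NodeName(doc.Find("p").Contents().First())) // "#text" (TextNode)
	fmt.Println(goquery.NodeName(doc.Find("nothing")))              // "" (empty selection)
}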
|
||||||
|
|
||||||
|
// OuterHtml returns the outer HTML rendering of the first item in
|
||||||
|
// the selection - that is, the HTML including the first element's
|
||||||
|
// tag and attributes.
|
||||||
|
//
|
||||||
|
// Unlike InnerHtml, this is a function and not a method on the Selection,
|
||||||
|
// because this is not a jQuery method (in javascript-land, this is
|
||||||
|
// a property provided by the DOM).
|
||||||
|
func OuterHtml(s *Selection) (string, error) {
|
||||||
|
var buf bytes.Buffer
|
||||||
|
|
||||||
|
if s.Length() == 0 {
|
||||||
|
return "", nil
|
||||||
|
}
|
||||||
|
n := s.Get(0)
|
||||||
|
if err := html.Render(&buf, n); err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
return buf.String(), nil
|
||||||
|
}
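// Illustrative sketch, not part of the vendored utilities.go: OuterHtml renders
// the first element including its own tag and attributes, whereas the
// Selection.Html method returns only the inner HTML. Assumes imports of fmt,
// log, strings and github.com/PuerkitoBio/goquery.
func exampleOuterHtml() {
	doc, err := goquery.NewDocumentFromReader(strings.NewReader(
		`<div id="box"><span>hi</span></div>`))
	if err != nil {
		log.Fatal(err)
	}
	sel := doc.Find("#box")
	outer, _ := goquery.OuterHtml(sel) // `<div id="box"><span>hi</span></div>`
	inner, _ := sel.Html()             // `<span>hi</span>`
	fmt.Println(outer, inner)
}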
|
||||||
|
|
||||||
|
// Loop through all container nodes to search for the target node.
|
||||||
|
func sliceContains(container []*html.Node, contained *html.Node) bool {
|
||||||
|
for _, n := range container {
|
||||||
|
if nodeContains(n, contained) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Checks if the contained node is within the container node.
|
||||||
|
func nodeContains(container *html.Node, contained *html.Node) bool {
|
||||||
|
// Check if the parent of the contained node is the container node, traversing
|
||||||
|
// upward until the top is reached, or the container is found.
|
||||||
|
for contained = contained.Parent; contained != nil; contained = contained.Parent {
|
||||||
|
if container == contained {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Checks if the target node is in the slice of nodes.
|
||||||
|
func isInSlice(slice []*html.Node, node *html.Node) bool {
|
||||||
|
return indexInSlice(slice, node) > -1
|
||||||
|
}
|
||||||
|
|
||||||
|
// Returns the index of the target node in the slice, or -1.
|
||||||
|
func indexInSlice(slice []*html.Node, node *html.Node) int {
|
||||||
|
if node != nil {
|
||||||
|
for i, n := range slice {
|
||||||
|
if n == node {
|
||||||
|
return i
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return -1
|
||||||
|
}
|
||||||
|
|
||||||
|
// Appends the new nodes to the target slice, making sure no duplicate is added.
|
||||||
|
// There is no check to the original state of the target slice, so it may still
|
||||||
|
// contain duplicates. The target slice is returned because append() may create
|
||||||
|
// a new underlying array. If targetSet is nil, a local set is created with the
|
||||||
|
// target if len(target) + len(nodes) is greater than minNodesForSet.
|
||||||
|
func appendWithoutDuplicates(target []*html.Node, nodes []*html.Node, targetSet map[*html.Node]bool) []*html.Node {
|
||||||
|
// if there are not that many nodes, don't use the map, faster to just use nested loops
|
||||||
|
// (unless a non-nil targetSet is passed, in which case the caller knows better).
|
||||||
|
if targetSet == nil && len(target)+len(nodes) < minNodesForSet {
|
||||||
|
for _, n := range nodes {
|
||||||
|
if !isInSlice(target, n) {
|
||||||
|
target = append(target, n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return target
|
||||||
|
}
|
||||||
|
|
||||||
|
// if a targetSet is passed, then assume it is reliable, otherwise create one
|
||||||
|
// and initialize it with the current target contents.
|
||||||
|
if targetSet == nil {
|
||||||
|
targetSet = make(map[*html.Node]bool, len(target))
|
||||||
|
for _, n := range target {
|
||||||
|
targetSet[n] = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for _, n := range nodes {
|
||||||
|
if !targetSet[n] {
|
||||||
|
target = append(target, n)
|
||||||
|
targetSet[n] = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return target
|
||||||
|
}
|
||||||
|
|
||||||
|
// Loop through a selection, returning only those nodes that pass the predicate
|
||||||
|
// function.
|
||||||
|
func grep(sel *Selection, predicate func(i int, s *Selection) bool) (result []*html.Node) {
|
||||||
|
for i, n := range sel.Nodes {
|
||||||
|
if predicate(i, newSingleSelection(n, sel.document)) {
|
||||||
|
result = append(result, n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return result
|
||||||
|
}
|
||||||
|
|
||||||
|
// Creates a new Selection object based on the specified nodes, and keeps the
|
||||||
|
// source Selection object on the stack (linked list).
|
||||||
|
func pushStack(fromSel *Selection, nodes []*html.Node) *Selection {
|
||||||
|
result := &Selection{nodes, fromSel.document, fromSel}
|
||||||
|
return result
|
||||||
|
}
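// Illustrative sketch, not part of the vendored utilities.go: pushStack is what
// keeps the previous Selection reachable as a linked list, which the public End
// method walks back to. Assumes imports of fmt, log, strings and
// github.com/PuerkitoBio/goquery.
func exampleSelectionStack() {
	doc, err := goquery.NewDocumentFromReader(strings.NewReader(
		"<ul><li>a</li><li>b</li></ul>"))
	if err != nil {
		log.Fatal(err)
	}
	lis := doc.Find("ul").Find("li") // each Find pushes a new Selection on the stack
	fmt.Println(lis.Length())        // 2
	fmt.Println(lis.End().Length())  // 1 — End pops back to the "ul" Selection
}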
|
24
vendor/github.com/andybalholm/cascadia/LICENSE
generated
vendored
Executable file
24
vendor/github.com/andybalholm/cascadia/LICENSE
generated
vendored
Executable file
|
@ -0,0 +1,24 @@
|
||||||
|
Copyright (c) 2011 Andy Balholm. All rights reserved.
|
||||||
|
|
||||||
|
Redistribution and use in source and binary forms, with or without
|
||||||
|
modification, are permitted provided that the following conditions are
|
||||||
|
met:
|
||||||
|
|
||||||
|
* Redistributions of source code must retain the above copyright
|
||||||
|
notice, this list of conditions and the following disclaimer.
|
||||||
|
* Redistributions in binary form must reproduce the above
|
||||||
|
copyright notice, this list of conditions and the following disclaimer
|
||||||
|
in the documentation and/or other materials provided with the
|
||||||
|
distribution.
|
||||||
|
|
||||||
|
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
|
||||||
|
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
|
||||||
|
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
|
||||||
|
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||||
|
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
|
||||||
|
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
|
||||||
|
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
|
||||||
|
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
|
||||||
|
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
9
vendor/github.com/andybalholm/cascadia/README.md
generated
vendored
Normal file
9
vendor/github.com/andybalholm/cascadia/README.md
generated
vendored
Normal file
|
@ -0,0 +1,9 @@
# cascadia

[![](https://travis-ci.org/andybalholm/cascadia.svg)](https://travis-ci.org/andybalholm/cascadia)

The Cascadia package implements CSS selectors for use with the parse trees produced by the html package.

To test CSS selectors without writing Go code, check out the [cascadia](https://github.com/suntong/cascadia) command-line tool, a thin wrapper around this package.

[Refer to godoc here](https://godoc.org/github.com/andybalholm/cascadia).
|
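A minimal usage sketch (not from the upstream README; it assumes Go modules and the `golang.org/x/net/html` parser):

```go
package main

import (
	"fmt"
	"log"
	"strings"

	"github.com/andybalholm/cascadia"
	"golang.org/x/net/html"
)

func main() {
	doc, err := html.Parse(strings.NewReader(`<ul><li class="x">a</li><li>b</li></ul>`))
	if err != nil {
		log.Fatal(err)
	}
	sel, err := cascadia.Compile("li.x")
	if err != nil {
		log.Fatal(err)
	}
	for _, n := range sel.MatchAll(doc) {
		fmt.Println(n.Data) // li
	}
}
```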
3
vendor/github.com/andybalholm/cascadia/go.mod
generated
vendored
Normal file
3
vendor/github.com/andybalholm/cascadia/go.mod
generated
vendored
Normal file
|
@ -0,0 +1,3 @@
module "github.com/andybalholm/cascadia"

require "golang.org/x/net" v0.0.0-20180218175443-cbe0f9307d01
|
835
vendor/github.com/andybalholm/cascadia/parser.go
generated
vendored
Normal file
835
vendor/github.com/andybalholm/cascadia/parser.go
generated
vendored
Normal file
|
@ -0,0 +1,835 @@
|
||||||
|
// Package cascadia is an implementation of CSS selectors.
|
||||||
|
package cascadia
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"regexp"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
)
|
||||||
|
|
||||||
|
// a parser for CSS selectors
|
||||||
|
type parser struct {
|
||||||
|
s string // the source text
|
||||||
|
i int // the current position
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseEscape parses a backslash escape.
|
||||||
|
func (p *parser) parseEscape() (result string, err error) {
|
||||||
|
if len(p.s) < p.i+2 || p.s[p.i] != '\\' {
|
||||||
|
return "", errors.New("invalid escape sequence")
|
||||||
|
}
|
||||||
|
|
||||||
|
start := p.i + 1
|
||||||
|
c := p.s[start]
|
||||||
|
switch {
|
||||||
|
case c == '\r' || c == '\n' || c == '\f':
|
||||||
|
return "", errors.New("escaped line ending outside string")
|
||||||
|
case hexDigit(c):
|
||||||
|
// unicode escape (hex)
|
||||||
|
var i int
|
||||||
|
for i = start; i < p.i+6 && i < len(p.s) && hexDigit(p.s[i]); i++ {
|
||||||
|
// empty
|
||||||
|
}
|
||||||
|
v, _ := strconv.ParseUint(p.s[start:i], 16, 21)
|
||||||
|
if len(p.s) > i {
|
||||||
|
switch p.s[i] {
|
||||||
|
case '\r':
|
||||||
|
i++
|
||||||
|
if len(p.s) > i && p.s[i] == '\n' {
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
case ' ', '\t', '\n', '\f':
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
p.i = i
|
||||||
|
return string(rune(v)), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Return the literal character after the backslash.
|
||||||
|
result = p.s[start : start+1]
|
||||||
|
p.i += 2
|
||||||
|
return result, nil
|
||||||
|
}
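// Illustrative sketch, not part of the vendored parser.go: parseEscape is what
// lets selectors reference identifiers that need escaping, e.g. an id that
// starts with a digit. In CSS, `\31 23` is the hex escape for "1" followed by
// "23", so the selector below is assumed to match id="123". Assumes imports of
// fmt, log, strings and golang.org/x/net/html; MustCompile is this package's
// own function, defined in selector.go.
func exampleEscapedSelector() {
	doc, err := html.Parse(strings.NewReader(`<p id="123">x</p>`))
	if err != nil {
		log.Fatal(err)
	}
	sel := MustCompile(`#\31 23`)
	fmt.Println(len(sel.MatchAll(doc))) // 1
}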
|
||||||
|
|
||||||
|
func hexDigit(c byte) bool {
|
||||||
|
return '0' <= c && c <= '9' || 'a' <= c && c <= 'f' || 'A' <= c && c <= 'F'
|
||||||
|
}
|
||||||
|
|
||||||
|
// nameStart returns whether c can be the first character of an identifier
|
||||||
|
// (not counting an initial hyphen, or an escape sequence).
|
||||||
|
func nameStart(c byte) bool {
|
||||||
|
return 'a' <= c && c <= 'z' || 'A' <= c && c <= 'Z' || c == '_' || c > 127
|
||||||
|
}
|
||||||
|
|
||||||
|
// nameChar returns whether c can be a character within an identifier
|
||||||
|
// (not counting an escape sequence).
|
||||||
|
func nameChar(c byte) bool {
|
||||||
|
return 'a' <= c && c <= 'z' || 'A' <= c && c <= 'Z' || c == '_' || c > 127 ||
|
||||||
|
c == '-' || '0' <= c && c <= '9'
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseIdentifier parses an identifier.
|
||||||
|
func (p *parser) parseIdentifier() (result string, err error) {
|
||||||
|
startingDash := false
|
||||||
|
if len(p.s) > p.i && p.s[p.i] == '-' {
|
||||||
|
startingDash = true
|
||||||
|
p.i++
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(p.s) <= p.i {
|
||||||
|
return "", errors.New("expected identifier, found EOF instead")
|
||||||
|
}
|
||||||
|
|
||||||
|
if c := p.s[p.i]; !(nameStart(c) || c == '\\') {
|
||||||
|
return "", fmt.Errorf("expected identifier, found %c instead", c)
|
||||||
|
}
|
||||||
|
|
||||||
|
result, err = p.parseName()
|
||||||
|
if startingDash && err == nil {
|
||||||
|
result = "-" + result
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseName parses a name (which is like an identifier, but doesn't have
|
||||||
|
// extra restrictions on the first character).
|
||||||
|
func (p *parser) parseName() (result string, err error) {
|
||||||
|
i := p.i
|
||||||
|
loop:
|
||||||
|
for i < len(p.s) {
|
||||||
|
c := p.s[i]
|
||||||
|
switch {
|
||||||
|
case nameChar(c):
|
||||||
|
start := i
|
||||||
|
for i < len(p.s) && nameChar(p.s[i]) {
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
result += p.s[start:i]
|
||||||
|
case c == '\\':
|
||||||
|
p.i = i
|
||||||
|
val, err := p.parseEscape()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
i = p.i
|
||||||
|
result += val
|
||||||
|
default:
|
||||||
|
break loop
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if result == "" {
|
||||||
|
return "", errors.New("expected name, found EOF instead")
|
||||||
|
}
|
||||||
|
|
||||||
|
p.i = i
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseString parses a single- or double-quoted string.
|
||||||
|
func (p *parser) parseString() (result string, err error) {
|
||||||
|
i := p.i
|
||||||
|
if len(p.s) < i+2 {
|
||||||
|
return "", errors.New("expected string, found EOF instead")
|
||||||
|
}
|
||||||
|
|
||||||
|
quote := p.s[i]
|
||||||
|
i++
|
||||||
|
|
||||||
|
loop:
|
||||||
|
for i < len(p.s) {
|
||||||
|
switch p.s[i] {
|
||||||
|
case '\\':
|
||||||
|
if len(p.s) > i+1 {
|
||||||
|
switch c := p.s[i+1]; c {
|
||||||
|
case '\r':
|
||||||
|
if len(p.s) > i+2 && p.s[i+2] == '\n' {
|
||||||
|
i += 3
|
||||||
|
continue loop
|
||||||
|
}
|
||||||
|
fallthrough
|
||||||
|
case '\n', '\f':
|
||||||
|
i += 2
|
||||||
|
continue loop
|
||||||
|
}
|
||||||
|
}
|
||||||
|
p.i = i
|
||||||
|
val, err := p.parseEscape()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
i = p.i
|
||||||
|
result += val
|
||||||
|
case quote:
|
||||||
|
break loop
|
||||||
|
case '\r', '\n', '\f':
|
||||||
|
return "", errors.New("unexpected end of line in string")
|
||||||
|
default:
|
||||||
|
start := i
|
||||||
|
for i < len(p.s) {
|
||||||
|
if c := p.s[i]; c == quote || c == '\\' || c == '\r' || c == '\n' || c == '\f' {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
result += p.s[start:i]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if i >= len(p.s) {
|
||||||
|
return "", errors.New("EOF in string")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Consume the final quote.
|
||||||
|
i++
|
||||||
|
|
||||||
|
p.i = i
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseRegex parses a regular expression; the end is defined by encountering an
|
||||||
|
// unmatched closing ')' or ']' which is not consumed
|
||||||
|
func (p *parser) parseRegex() (rx *regexp.Regexp, err error) {
|
||||||
|
i := p.i
|
||||||
|
if len(p.s) < i+2 {
|
||||||
|
return nil, errors.New("expected regular expression, found EOF instead")
|
||||||
|
}
|
||||||
|
|
||||||
|
// number of open parens or brackets;
|
||||||
|
// when it becomes negative, finished parsing regex
|
||||||
|
open := 0
|
||||||
|
|
||||||
|
loop:
|
||||||
|
for i < len(p.s) {
|
||||||
|
switch p.s[i] {
|
||||||
|
case '(', '[':
|
||||||
|
open++
|
||||||
|
case ')', ']':
|
||||||
|
open--
|
||||||
|
if open < 0 {
|
||||||
|
break loop
|
||||||
|
}
|
||||||
|
}
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
|
||||||
|
if i >= len(p.s) {
|
||||||
|
return nil, errors.New("EOF in regular expression")
|
||||||
|
}
|
||||||
|
rx, err = regexp.Compile(p.s[p.i:i])
|
||||||
|
p.i = i
|
||||||
|
return rx, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// skipWhitespace consumes whitespace characters and comments.
|
||||||
|
// It returns true if there was actually anything to skip.
|
||||||
|
func (p *parser) skipWhitespace() bool {
|
||||||
|
i := p.i
|
||||||
|
for i < len(p.s) {
|
||||||
|
switch p.s[i] {
|
||||||
|
case ' ', '\t', '\r', '\n', '\f':
|
||||||
|
i++
|
||||||
|
continue
|
||||||
|
case '/':
|
||||||
|
if strings.HasPrefix(p.s[i:], "/*") {
|
||||||
|
end := strings.Index(p.s[i+len("/*"):], "*/")
|
||||||
|
if end != -1 {
|
||||||
|
i += end + len("/**/")
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if i > p.i {
|
||||||
|
p.i = i
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// consumeParenthesis consumes an opening parenthesis and any following
|
||||||
|
// whitespace. It returns true if there was actually a parenthesis to skip.
|
||||||
|
func (p *parser) consumeParenthesis() bool {
|
||||||
|
if p.i < len(p.s) && p.s[p.i] == '(' {
|
||||||
|
p.i++
|
||||||
|
p.skipWhitespace()
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// consumeClosingParenthesis consumes a closing parenthesis and any preceding
|
||||||
|
// whitespace. It returns true if there was actually a parenthesis to skip.
|
||||||
|
func (p *parser) consumeClosingParenthesis() bool {
|
||||||
|
i := p.i
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i < len(p.s) && p.s[p.i] == ')' {
|
||||||
|
p.i++
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
p.i = i
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseTypeSelector parses a type selector (one that matches by tag name).
|
||||||
|
func (p *parser) parseTypeSelector() (result Selector, err error) {
|
||||||
|
tag, err := p.parseIdentifier()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return typeSelector(tag), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseIDSelector parses a selector that matches by id attribute.
|
||||||
|
func (p *parser) parseIDSelector() (Selector, error) {
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, fmt.Errorf("expected id selector (#id), found EOF instead")
|
||||||
|
}
|
||||||
|
if p.s[p.i] != '#' {
|
||||||
|
return nil, fmt.Errorf("expected id selector (#id), found '%c' instead", p.s[p.i])
|
||||||
|
}
|
||||||
|
|
||||||
|
p.i++
|
||||||
|
id, err := p.parseName()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return attributeEqualsSelector("id", id), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseClassSelector parses a selector that matches by class attribute.
|
||||||
|
func (p *parser) parseClassSelector() (Selector, error) {
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, fmt.Errorf("expected class selector (.class), found EOF instead")
|
||||||
|
}
|
||||||
|
if p.s[p.i] != '.' {
|
||||||
|
return nil, fmt.Errorf("expected class selector (.class), found '%c' instead", p.s[p.i])
|
||||||
|
}
|
||||||
|
|
||||||
|
p.i++
|
||||||
|
class, err := p.parseIdentifier()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return attributeIncludesSelector("class", class), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseAttributeSelector parses a selector that matches by attribute value.
|
||||||
|
func (p *parser) parseAttributeSelector() (Selector, error) {
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, fmt.Errorf("expected attribute selector ([attribute]), found EOF instead")
|
||||||
|
}
|
||||||
|
if p.s[p.i] != '[' {
|
||||||
|
return nil, fmt.Errorf("expected attribute selector ([attribute]), found '%c' instead", p.s[p.i])
|
||||||
|
}
|
||||||
|
|
||||||
|
p.i++
|
||||||
|
p.skipWhitespace()
|
||||||
|
key, err := p.parseIdentifier()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in attribute selector")
|
||||||
|
}
|
||||||
|
|
||||||
|
if p.s[p.i] == ']' {
|
||||||
|
p.i++
|
||||||
|
return attributeExistsSelector(key), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if p.i+2 >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in attribute selector")
|
||||||
|
}
|
||||||
|
|
||||||
|
op := p.s[p.i : p.i+2]
|
||||||
|
if op[0] == '=' {
|
||||||
|
op = "="
|
||||||
|
} else if op[1] != '=' {
|
||||||
|
return nil, fmt.Errorf(`expected equality operator, found "%s" instead`, op)
|
||||||
|
}
|
||||||
|
p.i += len(op)
|
||||||
|
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in attribute selector")
|
||||||
|
}
|
||||||
|
var val string
|
||||||
|
var rx *regexp.Regexp
|
||||||
|
if op == "#=" {
|
||||||
|
rx, err = p.parseRegex()
|
||||||
|
} else {
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '\'', '"':
|
||||||
|
val, err = p.parseString()
|
||||||
|
default:
|
||||||
|
val, err = p.parseIdentifier()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in attribute selector")
|
||||||
|
}
|
||||||
|
if p.s[p.i] != ']' {
|
||||||
|
return nil, fmt.Errorf("expected ']', found '%c' instead", p.s[p.i])
|
||||||
|
}
|
||||||
|
p.i++
|
||||||
|
|
||||||
|
switch op {
|
||||||
|
case "=":
|
||||||
|
return attributeEqualsSelector(key, val), nil
|
||||||
|
case "!=":
|
||||||
|
return attributeNotEqualSelector(key, val), nil
|
||||||
|
case "~=":
|
||||||
|
return attributeIncludesSelector(key, val), nil
|
||||||
|
case "|=":
|
||||||
|
return attributeDashmatchSelector(key, val), nil
|
||||||
|
case "^=":
|
||||||
|
return attributePrefixSelector(key, val), nil
|
||||||
|
case "$=":
|
||||||
|
return attributeSuffixSelector(key, val), nil
|
||||||
|
case "*=":
|
||||||
|
return attributeSubstringSelector(key, val), nil
|
||||||
|
case "#=":
|
||||||
|
return attributeRegexSelector(key, rx), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil, fmt.Errorf("attribute operator %q is not supported", op)
|
||||||
|
}
|
||||||
|
|
||||||
|
var errExpectedParenthesis = errors.New("expected '(' but didn't find it")
|
||||||
|
var errExpectedClosingParenthesis = errors.New("expected ')' but didn't find it")
|
||||||
|
var errUnmatchedParenthesis = errors.New("unmatched '('")
|
||||||
|
|
||||||
|
// parsePseudoclassSelector parses a pseudoclass selector like :not(p).
|
||||||
|
func (p *parser) parsePseudoclassSelector() (Selector, error) {
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, fmt.Errorf("expected pseudoclass selector (:pseudoclass), found EOF instead")
|
||||||
|
}
|
||||||
|
if p.s[p.i] != ':' {
|
||||||
|
return nil, fmt.Errorf("expected attribute selector (:pseudoclass), found '%c' instead", p.s[p.i])
|
||||||
|
}
|
||||||
|
|
||||||
|
p.i++
|
||||||
|
name, err := p.parseIdentifier()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
name = toLowerASCII(name)
|
||||||
|
|
||||||
|
switch name {
|
||||||
|
case "not", "has", "haschild":
|
||||||
|
if !p.consumeParenthesis() {
|
||||||
|
return nil, errExpectedParenthesis
|
||||||
|
}
|
||||||
|
sel, parseErr := p.parseSelectorGroup()
|
||||||
|
if parseErr != nil {
|
||||||
|
return nil, parseErr
|
||||||
|
}
|
||||||
|
if !p.consumeClosingParenthesis() {
|
||||||
|
return nil, errExpectedClosingParenthesis
|
||||||
|
}
|
||||||
|
|
||||||
|
switch name {
|
||||||
|
case "not":
|
||||||
|
return negatedSelector(sel), nil
|
||||||
|
case "has":
|
||||||
|
return hasDescendantSelector(sel), nil
|
||||||
|
case "haschild":
|
||||||
|
return hasChildSelector(sel), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
case "contains", "containsown":
|
||||||
|
if !p.consumeParenthesis() {
|
||||||
|
return nil, errExpectedParenthesis
|
||||||
|
}
|
||||||
|
if p.i == len(p.s) {
|
||||||
|
return nil, errUnmatchedParenthesis
|
||||||
|
}
|
||||||
|
var val string
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '\'', '"':
|
||||||
|
val, err = p.parseString()
|
||||||
|
default:
|
||||||
|
val, err = p.parseIdentifier()
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
val = strings.ToLower(val)
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in pseudo selector")
|
||||||
|
}
|
||||||
|
if !p.consumeClosingParenthesis() {
|
||||||
|
return nil, errExpectedClosingParenthesis
|
||||||
|
}
|
||||||
|
|
||||||
|
switch name {
|
||||||
|
case "contains":
|
||||||
|
return textSubstrSelector(val), nil
|
||||||
|
case "containsown":
|
||||||
|
return ownTextSubstrSelector(val), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
case "matches", "matchesown":
|
||||||
|
if !p.consumeParenthesis() {
|
||||||
|
return nil, errExpectedParenthesis
|
||||||
|
}
|
||||||
|
rx, err := p.parseRegex()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("unexpected EOF in pseudo selector")
|
||||||
|
}
|
||||||
|
if !p.consumeClosingParenthesis() {
|
||||||
|
return nil, errExpectedClosingParenthesis
|
||||||
|
}
|
||||||
|
|
||||||
|
switch name {
|
||||||
|
case "matches":
|
||||||
|
return textRegexSelector(rx), nil
|
||||||
|
case "matchesown":
|
||||||
|
return ownTextRegexSelector(rx), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
case "nth-child", "nth-last-child", "nth-of-type", "nth-last-of-type":
|
||||||
|
if !p.consumeParenthesis() {
|
||||||
|
return nil, errExpectedParenthesis
|
||||||
|
}
|
||||||
|
a, b, err := p.parseNth()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if !p.consumeClosingParenthesis() {
|
||||||
|
return nil, errExpectedClosingParenthesis
|
||||||
|
}
|
||||||
|
if a == 0 {
|
||||||
|
switch name {
|
||||||
|
case "nth-child":
|
||||||
|
return simpleNthChildSelector(b, false), nil
|
||||||
|
case "nth-of-type":
|
||||||
|
return simpleNthChildSelector(b, true), nil
|
||||||
|
case "nth-last-child":
|
||||||
|
return simpleNthLastChildSelector(b, false), nil
|
||||||
|
case "nth-last-of-type":
|
||||||
|
return simpleNthLastChildSelector(b, true), nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nthChildSelector(a, b,
|
||||||
|
name == "nth-last-child" || name == "nth-last-of-type",
|
||||||
|
name == "nth-of-type" || name == "nth-last-of-type"),
|
||||||
|
nil
|
||||||
|
|
||||||
|
case "first-child":
|
||||||
|
return simpleNthChildSelector(1, false), nil
|
||||||
|
case "last-child":
|
||||||
|
return simpleNthLastChildSelector(1, false), nil
|
||||||
|
case "first-of-type":
|
||||||
|
return simpleNthChildSelector(1, true), nil
|
||||||
|
case "last-of-type":
|
||||||
|
return simpleNthLastChildSelector(1, true), nil
|
||||||
|
case "only-child":
|
||||||
|
return onlyChildSelector(false), nil
|
||||||
|
case "only-of-type":
|
||||||
|
return onlyChildSelector(true), nil
|
||||||
|
case "input":
|
||||||
|
return inputSelector, nil
|
||||||
|
case "empty":
|
||||||
|
return emptyElementSelector, nil
|
||||||
|
case "root":
|
||||||
|
return rootSelector, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil, fmt.Errorf("unknown pseudoclass :%s", name)
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseInteger parses a decimal integer.
|
||||||
|
func (p *parser) parseInteger() (int, error) {
|
||||||
|
i := p.i
|
||||||
|
start := i
|
||||||
|
for i < len(p.s) && '0' <= p.s[i] && p.s[i] <= '9' {
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
if i == start {
|
||||||
|
return 0, errors.New("expected integer, but didn't find it")
|
||||||
|
}
|
||||||
|
p.i = i
|
||||||
|
|
||||||
|
val, err := strconv.Atoi(p.s[start:i])
|
||||||
|
if err != nil {
|
||||||
|
return 0, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return val, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseNth parses the argument for :nth-child (normally of the form an+b).
|
||||||
|
func (p *parser) parseNth() (a, b int, err error) {
|
||||||
|
// initial state
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
goto eof
|
||||||
|
}
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '-':
|
||||||
|
p.i++
|
||||||
|
goto negativeA
|
||||||
|
case '+':
|
||||||
|
p.i++
|
||||||
|
goto positiveA
|
||||||
|
case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9':
|
||||||
|
goto positiveA
|
||||||
|
case 'n', 'N':
|
||||||
|
a = 1
|
||||||
|
p.i++
|
||||||
|
goto readN
|
||||||
|
case 'o', 'O', 'e', 'E':
|
||||||
|
id, nameErr := p.parseName()
|
||||||
|
if nameErr != nil {
|
||||||
|
return 0, 0, nameErr
|
||||||
|
}
|
||||||
|
id = toLowerASCII(id)
|
||||||
|
if id == "odd" {
|
||||||
|
return 2, 1, nil
|
||||||
|
}
|
||||||
|
if id == "even" {
|
||||||
|
return 2, 0, nil
|
||||||
|
}
|
||||||
|
return 0, 0, fmt.Errorf("expected 'odd' or 'even', but found '%s' instead", id)
|
||||||
|
default:
|
||||||
|
goto invalid
|
||||||
|
}
|
||||||
|
|
||||||
|
positiveA:
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
goto eof
|
||||||
|
}
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9':
|
||||||
|
a, err = p.parseInteger()
|
||||||
|
if err != nil {
|
||||||
|
return 0, 0, err
|
||||||
|
}
|
||||||
|
goto readA
|
||||||
|
case 'n', 'N':
|
||||||
|
a = 1
|
||||||
|
p.i++
|
||||||
|
goto readN
|
||||||
|
default:
|
||||||
|
goto invalid
|
||||||
|
}
|
||||||
|
|
||||||
|
negativeA:
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
goto eof
|
||||||
|
}
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9':
|
||||||
|
a, err = p.parseInteger()
|
||||||
|
if err != nil {
|
||||||
|
return 0, 0, err
|
||||||
|
}
|
||||||
|
a = -a
|
||||||
|
goto readA
|
||||||
|
case 'n', 'N':
|
||||||
|
a = -1
|
||||||
|
p.i++
|
||||||
|
goto readN
|
||||||
|
default:
|
||||||
|
goto invalid
|
||||||
|
}
|
||||||
|
|
||||||
|
readA:
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
goto eof
|
||||||
|
}
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case 'n', 'N':
|
||||||
|
p.i++
|
||||||
|
goto readN
|
||||||
|
default:
|
||||||
|
// The number we read as a is actually b.
|
||||||
|
return 0, a, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
readN:
|
||||||
|
p.skipWhitespace()
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
goto eof
|
||||||
|
}
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '+':
|
||||||
|
p.i++
|
||||||
|
p.skipWhitespace()
|
||||||
|
b, err = p.parseInteger()
|
||||||
|
if err != nil {
|
||||||
|
return 0, 0, err
|
||||||
|
}
|
||||||
|
return a, b, nil
|
||||||
|
case '-':
|
||||||
|
p.i++
|
||||||
|
p.skipWhitespace()
|
||||||
|
b, err = p.parseInteger()
|
||||||
|
if err != nil {
|
||||||
|
return 0, 0, err
|
||||||
|
}
|
||||||
|
return a, -b, nil
|
||||||
|
default:
|
||||||
|
return a, 0, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
eof:
|
||||||
|
return 0, 0, errors.New("unexpected EOF while attempting to parse expression of form an+b")
|
||||||
|
|
||||||
|
invalid:
|
||||||
|
return 0, 0, errors.New("unexpected character while attempting to parse expression of form an+b")
|
||||||
|
}
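// Illustrative sketch, not part of the vendored parser.go: how the an+b forms
// accepted by parseNth map onto matches. ":nth-child(2n+1)" parses to a=2, b=1
// (every odd child), and the keywords "odd" and "even" are shortcuts for 2n+1
// and 2n. Assumes imports of fmt, log, strings and golang.org/x/net/html;
// MustCompile is this package's own function, defined in selector.go.
func exampleNth() {
	doc, err := html.Parse(strings.NewReader(
		"<ul><li>1</li><li>2</li><li>3</li><li>4</li></ul>"))
	if err != nil {
		log.Fatal(err)
	}
	odd := MustCompile("li:nth-child(2n+1)")
	even := MustCompile("li:nth-child(even)")
	fmt.Println(len(odd.MatchAll(doc)), len(even.MatchAll(doc))) // 2 2
}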
|
||||||
|
|
||||||
|
// parseSimpleSelectorSequence parses a selector sequence that applies to
|
||||||
|
// a single element.
|
||||||
|
func (p *parser) parseSimpleSelectorSequence() (Selector, error) {
|
||||||
|
var result Selector
|
||||||
|
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return nil, errors.New("expected selector, found EOF instead")
|
||||||
|
}
|
||||||
|
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '*':
|
||||||
|
// It's the universal selector. Just skip over it, since it doesn't affect the meaning.
|
||||||
|
p.i++
|
||||||
|
case '#', '.', '[', ':':
|
||||||
|
// There's no type selector. Wait to process the other till the main loop.
|
||||||
|
default:
|
||||||
|
r, err := p.parseTypeSelector()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
result = r
|
||||||
|
}
|
||||||
|
|
||||||
|
loop:
|
||||||
|
for p.i < len(p.s) {
|
||||||
|
var ns Selector
|
||||||
|
var err error
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '#':
|
||||||
|
ns, err = p.parseIDSelector()
|
||||||
|
case '.':
|
||||||
|
ns, err = p.parseClassSelector()
|
||||||
|
case '[':
|
||||||
|
ns, err = p.parseAttributeSelector()
|
||||||
|
case ':':
|
||||||
|
ns, err = p.parsePseudoclassSelector()
|
||||||
|
default:
|
||||||
|
break loop
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if result == nil {
|
||||||
|
result = ns
|
||||||
|
} else {
|
||||||
|
result = intersectionSelector(result, ns)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if result == nil {
|
||||||
|
result = func(n *html.Node) bool {
|
||||||
|
return n.Type == html.ElementNode
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseSelector parses a selector that may include combinators.
|
||||||
|
func (p *parser) parseSelector() (result Selector, err error) {
|
||||||
|
p.skipWhitespace()
|
||||||
|
result, err = p.parseSimpleSelectorSequence()
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
for {
|
||||||
|
var combinator byte
|
||||||
|
if p.skipWhitespace() {
|
||||||
|
combinator = ' '
|
||||||
|
}
|
||||||
|
if p.i >= len(p.s) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
switch p.s[p.i] {
|
||||||
|
case '+', '>', '~':
|
||||||
|
combinator = p.s[p.i]
|
||||||
|
p.i++
|
||||||
|
p.skipWhitespace()
|
||||||
|
case ',', ')':
|
||||||
|
// These characters can't begin a selector, but they can legally occur after one.
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if combinator == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
c, err := p.parseSimpleSelectorSequence()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
switch combinator {
|
||||||
|
case ' ':
|
||||||
|
result = descendantSelector(result, c)
|
||||||
|
case '>':
|
||||||
|
result = childSelector(result, c)
|
||||||
|
case '+':
|
||||||
|
result = siblingSelector(result, c, true)
|
||||||
|
case '~':
|
||||||
|
result = siblingSelector(result, c, false)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
panic("unreachable")
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseSelectorGroup parses a group of selectors, separated by commas.
|
||||||
|
func (p *parser) parseSelectorGroup() (result Selector, err error) {
|
||||||
|
result, err = p.parseSelector()
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
for p.i < len(p.s) {
|
||||||
|
if p.s[p.i] != ',' {
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
p.i++
|
||||||
|
c, err := p.parseSelector()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
result = unionSelector(result, c)
|
||||||
|
}
|
||||||
|
|
||||||
|
return
|
||||||
|
}
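// Illustrative sketch, not part of the vendored parser.go: parseSelectorGroup
// splits alternatives on commas, and parseSelector applies the descendant
// (space), child (>), adjacent (+) and general (~) sibling combinators, so a
// single compiled Selector can cover several alternatives at once. Assumes
// imports of fmt, log, strings and golang.org/x/net/html; MustCompile is this
// package's own function, defined in selector.go.
func exampleGroup() {
	doc, err := html.Parse(strings.NewReader(
		`<div><p>a</p><span class="note">b</span></div>`))
	if err != nil {
		log.Fatal(err)
	}
	sel := MustCompile("div > p, span.note")
	fmt.Println(len(sel.MatchAll(doc))) // 2 — one match per alternative here
}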
|
622
vendor/github.com/andybalholm/cascadia/selector.go
generated
vendored
Normal file
622
vendor/github.com/andybalholm/cascadia/selector.go
generated
vendored
Normal file
|
@ -0,0 +1,622 @@
|
||||||
|
package cascadia
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"fmt"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
)
|
||||||
|
|
||||||
|
// the Selector type, and functions for creating them
|
||||||
|
|
||||||
|
// A Selector is a function which tells whether a node matches or not.
|
||||||
|
type Selector func(*html.Node) bool
|
||||||
|
|
||||||
|
// hasChildMatch returns whether n has any child that matches a.
|
||||||
|
func hasChildMatch(n *html.Node, a Selector) bool {
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if a(c) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// hasDescendantMatch performs a depth-first search of n's descendants,
|
||||||
|
// testing whether any of them match a. It returns true as soon as a match is
|
||||||
|
// found, or false if no match is found.
|
||||||
|
func hasDescendantMatch(n *html.Node, a Selector) bool {
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if a(c) || (c.Type == html.ElementNode && hasDescendantMatch(c, a)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Compile parses a selector and returns, if successful, a Selector object
|
||||||
|
// that can be used to match against html.Node objects.
|
||||||
|
func Compile(sel string) (Selector, error) {
|
||||||
|
p := &parser{s: sel}
|
||||||
|
compiled, err := p.parseSelectorGroup()
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if p.i < len(sel) {
|
||||||
|
return nil, fmt.Errorf("parsing %q: %d bytes left over", sel, len(sel)-p.i)
|
||||||
|
}
|
||||||
|
|
||||||
|
return compiled, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// MustCompile is like Compile, but panics instead of returning an error.
|
||||||
|
func MustCompile(sel string) Selector {
|
||||||
|
compiled, err := Compile(sel)
|
||||||
|
if err != nil {
|
||||||
|
panic(err)
|
||||||
|
}
|
||||||
|
return compiled
|
||||||
|
}
|
||||||
|
|
||||||
|
// MatchAll returns a slice of the nodes that match the selector,
|
||||||
|
// from n and its children.
|
||||||
|
func (s Selector) MatchAll(n *html.Node) []*html.Node {
|
||||||
|
return s.matchAllInto(n, nil)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s Selector) matchAllInto(n *html.Node, storage []*html.Node) []*html.Node {
|
||||||
|
if s(n) {
|
||||||
|
storage = append(storage, n)
|
||||||
|
}
|
||||||
|
|
||||||
|
for child := n.FirstChild; child != nil; child = child.NextSibling {
|
||||||
|
storage = s.matchAllInto(child, storage)
|
||||||
|
}
|
||||||
|
|
||||||
|
return storage
|
||||||
|
}
|
||||||
|
|
||||||
|
// Match returns true if the node matches the selector.
|
||||||
|
func (s Selector) Match(n *html.Node) bool {
|
||||||
|
return s(n)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MatchFirst returns the first node that matches s, from n and its children.
|
||||||
|
func (s Selector) MatchFirst(n *html.Node) *html.Node {
|
||||||
|
if s.Match(n) {
|
||||||
|
return n
|
||||||
|
}
|
||||||
|
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
m := s.MatchFirst(c)
|
||||||
|
if m != nil {
|
||||||
|
return m
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Filter returns the nodes in nodes that match the selector.
|
||||||
|
func (s Selector) Filter(nodes []*html.Node) (result []*html.Node) {
|
||||||
|
for _, n := range nodes {
|
||||||
|
if s(n) {
|
||||||
|
result = append(result, n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return result
|
||||||
|
}
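// Illustrative sketch, not part of the vendored selector.go: the usual ways a
// compiled Selector is applied. Assumes imports of fmt, log, strings and
// golang.org/x/net/html, sitting in this package so Compile/MustCompile are
// called directly.
func exampleMatching() {
	doc, err := html.Parse(strings.NewReader("<p>a</p><p>b</p>"))
	if err != nil {
		log.Fatal(err)
	}
	sel := MustCompile("p")
	all := sel.MatchAll(doc)     // every matching node, depth-first
	first := sel.MatchFirst(doc) // nil if nothing matches
	kept := sel.Filter(all)      // subset of an existing slice
	fmt.Println(len(all), first.Data, len(kept)) // 2 p 2
}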
|
||||||
|
|
||||||
|
// typeSelector returns a Selector that matches elements with a given tag name.
|
||||||
|
func typeSelector(tag string) Selector {
|
||||||
|
tag = toLowerASCII(tag)
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
return n.Type == html.ElementNode && n.Data == tag
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// toLowerASCII returns s with all ASCII capital letters lowercased.
|
||||||
|
func toLowerASCII(s string) string {
|
||||||
|
var b []byte
|
||||||
|
for i := 0; i < len(s); i++ {
|
||||||
|
if c := s[i]; 'A' <= c && c <= 'Z' {
|
||||||
|
if b == nil {
|
||||||
|
b = make([]byte, len(s))
|
||||||
|
copy(b, s)
|
||||||
|
}
|
||||||
|
b[i] = s[i] + ('a' - 'A')
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if b == nil {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
return string(b)
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeSelector returns a Selector that matches elements
|
||||||
|
// where the attribute named key satisfies the function f.
|
||||||
|
func attributeSelector(key string, f func(string) bool) Selector {
|
||||||
|
key = toLowerASCII(key)
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
for _, a := range n.Attr {
|
||||||
|
if a.Key == key && f(a.Val) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeExistsSelector returns a Selector that matches elements that have
|
||||||
|
// an attribute named key.
|
||||||
|
func attributeExistsSelector(key string) Selector {
|
||||||
|
return attributeSelector(key, func(string) bool { return true })
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeEqualsSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key has the value val.
|
||||||
|
func attributeEqualsSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
return s == val
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeNotEqualSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key does not have the value val.
|
||||||
|
func attributeNotEqualSelector(key, val string) Selector {
|
||||||
|
key = toLowerASCII(key)
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
for _, a := range n.Attr {
|
||||||
|
if a.Key == key && a.Val == val {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeIncludesSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key is a whitespace-separated list that includes val.
|
||||||
|
func attributeIncludesSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
for s != "" {
|
||||||
|
i := strings.IndexAny(s, " \t\r\n\f")
|
||||||
|
if i == -1 {
|
||||||
|
return s == val
|
||||||
|
}
|
||||||
|
if s[:i] == val {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
s = s[i+1:]
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeDashmatchSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key equals val or starts with val plus a hyphen.
|
||||||
|
func attributeDashmatchSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
if s == val {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
if len(s) <= len(val) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
if s[:len(val)] == val && s[len(val)] == '-' {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributePrefixSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key starts with val.
|
||||||
|
func attributePrefixSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
if strings.TrimSpace(s) == "" {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return strings.HasPrefix(s, val)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeSuffixSelector returns a Selector that matches elements where
|
||||||
|
// the attribute named key ends with val.
|
||||||
|
func attributeSuffixSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
if strings.TrimSpace(s) == "" {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return strings.HasSuffix(s, val)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeSubstringSelector returns a Selector that matches nodes where
|
||||||
|
// the attribute named key contains val.
|
||||||
|
func attributeSubstringSelector(key, val string) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
if strings.TrimSpace(s) == "" {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return strings.Contains(s, val)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// attributeRegexSelector returns a Selector that matches nodes where
|
||||||
|
// the attribute named key matches the regular expression rx
|
||||||
|
func attributeRegexSelector(key string, rx *regexp.Regexp) Selector {
|
||||||
|
return attributeSelector(key,
|
||||||
|
func(s string) bool {
|
||||||
|
return rx.MatchString(s)
|
||||||
|
})
|
||||||
|
}
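// Illustrative sketch, not part of the vendored selector.go: the attribute
// operators wired up above, including the non-standard #= regular-expression
// operator handled by parseAttributeSelector. Assumes imports of fmt, log,
// strings and golang.org/x/net/html; MustCompile is this package's own function.
func exampleAttributeSelectors() {
	doc, err := html.Parse(strings.NewReader(
		`<a href="https://example.com/doc.pdf" hreflang="en-US" class="ext link">x</a>`))
	if err != nil {
		log.Fatal(err)
	}
	for _, expr := range []string{
		`a[href]`,            // attribute exists
		`a[class~="link"]`,   // whitespace-separated list includes "link"
		`a[hreflang|="en"]`,  // equals "en" or starts with "en-"
		`a[href^="https"]`,   // prefix
		`a[href$=".pdf"]`,    // suffix
		`a[href*="example"]`, // substring
		`a[href#=\.pdf$]`,    // cascadia extension: regular expression match
	} {
		fmt.Println(expr, len(MustCompile(expr).MatchAll(doc)) == 1)
	}
}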
|
||||||
|
|
||||||
|
// intersectionSelector returns a selector that matches nodes that match
|
||||||
|
// both a and b.
|
||||||
|
func intersectionSelector(a, b Selector) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
return a(n) && b(n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// unionSelector returns a selector that matches elements that match
|
||||||
|
// either a or b.
|
||||||
|
func unionSelector(a, b Selector) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
return a(n) || b(n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// negatedSelector returns a selector that matches elements that do not match a.
|
||||||
|
func negatedSelector(a Selector) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return !a(n)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// writeNodeText writes the text contained in n and its descendants to b.
|
||||||
|
func writeNodeText(n *html.Node, b *bytes.Buffer) {
|
||||||
|
switch n.Type {
|
||||||
|
case html.TextNode:
|
||||||
|
b.WriteString(n.Data)
|
||||||
|
case html.ElementNode:
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
writeNodeText(c, b)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// nodeText returns the text contained in n and its descendants.
|
||||||
|
func nodeText(n *html.Node) string {
|
||||||
|
var b bytes.Buffer
|
||||||
|
writeNodeText(n, &b)
|
||||||
|
return b.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
// nodeOwnText returns the contents of the text nodes that are direct
|
||||||
|
// children of n.
|
||||||
|
func nodeOwnText(n *html.Node) string {
|
||||||
|
var b bytes.Buffer
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if c.Type == html.TextNode {
|
||||||
|
b.WriteString(c.Data)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return b.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
// textSubstrSelector returns a selector that matches nodes that
|
||||||
|
// contain the given text.
|
||||||
|
func textSubstrSelector(val string) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
text := strings.ToLower(nodeText(n))
|
||||||
|
return strings.Contains(text, val)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ownTextSubstrSelector returns a selector that matches nodes that
|
||||||
|
// directly contain the given text
|
||||||
|
func ownTextSubstrSelector(val string) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
text := strings.ToLower(nodeOwnText(n))
|
||||||
|
return strings.Contains(text, val)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// textRegexSelector returns a selector that matches nodes whose text matches
|
||||||
|
// the specified regular expression
|
||||||
|
func textRegexSelector(rx *regexp.Regexp) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
return rx.MatchString(nodeText(n))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ownTextRegexSelector returns a selector that matches nodes whose text
|
||||||
|
// directly matches the specified regular expression
|
||||||
|
func ownTextRegexSelector(rx *regexp.Regexp) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
return rx.MatchString(nodeOwnText(n))
|
||||||
|
}
|
||||||
|
}
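// Illustrative sketch, not part of the vendored selector.go: the text-based
// pseudo-classes implemented above are cascadia extensions rather than standard
// CSS, and contains/containsown lowercase both the argument and the node text
// before comparing. Assumes imports of fmt, log, strings and
// golang.org/x/net/html; MustCompile is this package's own function.
func exampleTextSelectors() {
	doc, err := html.Parse(strings.NewReader(
		"<div><p>Hello World</p><p>bye</p></div>"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(len(MustCompile(`p:contains("hello")`).MatchAll(doc)))        // 1
	fmt.Println(len(MustCompile(`p:matches(^b)`).MatchAll(doc)))              // 1
	fmt.Println(len(MustCompile(`div:has(p:contains("bye"))`).MatchAll(doc))) // 1
}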
|
||||||
|
|
||||||
|
// hasChildSelector returns a selector that matches elements
|
||||||
|
// with a child that matches a.
|
||||||
|
func hasChildSelector(a Selector) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return hasChildMatch(n, a)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// hasDescendantSelector returns a selector that matches elements
|
||||||
|
// with any descendant that matches a.
|
||||||
|
func hasDescendantSelector(a Selector) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return hasDescendantMatch(n, a)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// nthChildSelector returns a selector that implements :nth-child(an+b).
|
||||||
|
// If last is true, implements :nth-last-child instead.
|
||||||
|
// If ofType is true, implements :nth-of-type instead.
|
||||||
|
func nthChildSelector(a, b int, last, ofType bool) Selector {
|
||||||
|
return func(n *html.Node) bool {
|
||||||
|
if n.Type != html.ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
parent := n.Parent
|
||||||
|
if parent == nil {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if parent.Type == html.DocumentNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
i := -1
|
||||||
|
count := 0
|
||||||
|
for c := parent.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if (c.Type != html.ElementNode) || (ofType && c.Data != n.Data) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
count++
|
||||||
|
if c == n {
|
||||||
|
i = count
|
||||||
|
if !last {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if i == -1 {
|
||||||
|
// This shouldn't happen, since n should always be one of its parent's children.
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if last {
|
||||||
|
i = count - i + 1
|
||||||
|
}
|
||||||
|
|
||||||
|
i -= b
|
||||||
|
if a == 0 {
|
||||||
|
return i == 0
|
||||||
|
}
|
||||||
|
|
||||||
|
return i%a == 0 && i/a >= 0
|
||||||
|
}
|
||||||
|
}
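// Worked example for the an+b arithmetic above: :nth-child(2n+1) compiles to
// a=2, b=1, so a child at 1-based position i matches when i-b is a
// non-negative multiple of a, i.e. positions 1, 3, 5, ...; :nth-child(3)
// compiles to a=0, b=3, and then only i == b matches.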

// simpleNthChildSelector returns a selector that implements :nth-child(b).
// If ofType is true, implements :nth-of-type instead.
func simpleNthChildSelector(b int, ofType bool) Selector {
	return func(n *html.Node) bool {
		if n.Type != html.ElementNode {
			return false
		}

		parent := n.Parent
		if parent == nil {
			return false
		}

		if parent.Type == html.DocumentNode {
			return false
		}

		count := 0
		for c := parent.FirstChild; c != nil; c = c.NextSibling {
			if c.Type != html.ElementNode || (ofType && c.Data != n.Data) {
				continue
			}
			count++
			if c == n {
				return count == b
			}
			if count >= b {
				return false
			}
		}
		return false
	}
}

// simpleNthLastChildSelector returns a selector that implements
// :nth-last-child(b). If ofType is true, implements :nth-last-of-type
// instead.
func simpleNthLastChildSelector(b int, ofType bool) Selector {
	return func(n *html.Node) bool {
		if n.Type != html.ElementNode {
			return false
		}

		parent := n.Parent
		if parent == nil {
			return false
		}

		if parent.Type == html.DocumentNode {
			return false
		}

		count := 0
		for c := parent.LastChild; c != nil; c = c.PrevSibling {
			if c.Type != html.ElementNode || (ofType && c.Data != n.Data) {
				continue
			}
			count++
			if c == n {
				return count == b
			}
			if count >= b {
				return false
			}
		}
		return false
	}
}

// onlyChildSelector returns a selector that implements :only-child.
// If ofType is true, it implements :only-of-type instead.
func onlyChildSelector(ofType bool) Selector {
	return func(n *html.Node) bool {
		if n.Type != html.ElementNode {
			return false
		}

		parent := n.Parent
		if parent == nil {
			return false
		}

		if parent.Type == html.DocumentNode {
			return false
		}

		count := 0
		for c := parent.FirstChild; c != nil; c = c.NextSibling {
			if (c.Type != html.ElementNode) || (ofType && c.Data != n.Data) {
				continue
			}
			count++
			if count > 1 {
				return false
			}
		}

		return count == 1
	}
}

// inputSelector is a Selector that matches input, select, textarea and button elements.
func inputSelector(n *html.Node) bool {
	return n.Type == html.ElementNode && (n.Data == "input" || n.Data == "select" || n.Data == "textarea" || n.Data == "button")
}

// emptyElementSelector is a Selector that matches empty elements.
func emptyElementSelector(n *html.Node) bool {
	if n.Type != html.ElementNode {
		return false
	}

	for c := n.FirstChild; c != nil; c = c.NextSibling {
		switch c.Type {
		case html.ElementNode, html.TextNode:
			return false
		}
	}

	return true
}

// descendantSelector returns a Selector that matches an element if
// it matches d and has an ancestor that matches a.
func descendantSelector(a, d Selector) Selector {
	return func(n *html.Node) bool {
		if !d(n) {
			return false
		}

		for p := n.Parent; p != nil; p = p.Parent {
			if a(p) {
				return true
			}
		}

		return false
	}
}

// childSelector returns a Selector that matches an element if
// it matches d and its parent matches a.
func childSelector(a, d Selector) Selector {
	return func(n *html.Node) bool {
		return d(n) && n.Parent != nil && a(n.Parent)
	}
}

// siblingSelector returns a Selector that matches an element
// if it matches s2 and is preceded by an element that matches s1.
// If adjacent is true, the sibling must be immediately before the element.
func siblingSelector(s1, s2 Selector, adjacent bool) Selector {
	return func(n *html.Node) bool {
		if !s2(n) {
			return false
		}

		if adjacent {
			for n = n.PrevSibling; n != nil; n = n.PrevSibling {
				if n.Type == html.TextNode || n.Type == html.CommentNode {
					continue
				}
				return s1(n)
			}
			return false
		}

		// Walk backwards looking for element that matches s1
		for c := n.PrevSibling; c != nil; c = c.PrevSibling {
			if s1(c) {
				return true
			}
		}

		return false
	}
}

// rootSelector implements :root
func rootSelector(n *html.Node) bool {
	if n.Type != html.ElementNode {
		return false
	}
	if n.Parent == nil {
		return false
	}
	return n.Parent.Type == html.DocumentNode
}
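The selectors above are the building blocks cascadia composes when it compiles a CSS selector string. A minimal usage sketch of the exported API they back, assuming the conventional import paths `github.com/andybalholm/cascadia` and `golang.org/x/net/html` and the package's `Compile`/`MatchAll` entry points:

```go
package main

import (
	"fmt"
	"strings"

	"github.com/andybalholm/cascadia"
	"golang.org/x/net/html"
)

func main() {
	doc, err := html.Parse(strings.NewReader(
		`<ul><li>a</li><li class="x">b</li><li>c</li></ul>`))
	if err != nil {
		panic(err)
	}

	// Compile turns the selector text into a Selector, i.e. a
	// func(*html.Node) bool assembled from combinators like the ones above.
	sel, err := cascadia.Compile(`li.x, li:nth-child(3)`)
	if err != nil {
		panic(err)
	}

	// MatchAll walks the parsed tree and returns matching element nodes
	// in document order.
	for _, n := range sel.MatchAll(doc) {
		fmt.Println(n.FirstChild.Data) // expected: "b", then "c"
	}
}
```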
22 vendor/github.com/chilts/sid/LICENSE generated vendored Normal file
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2017 Andrew Chilton <andychilton@gmail.com>

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
42 vendor/github.com/chilts/sid/ReadMe.md generated vendored Normal file
@@ -0,0 +1,42 @@
# sid : generate sortable identifiers

## Overview

[![GoDoc](https://godoc.org/github.com/chilts/sid?status.svg)](https://godoc.org/github.com/chilts/sid)
[![Build Status](https://travis-ci.org/chilts/sid.svg?branch=master)](https://travis-ci.org/chilts/sid)
[![Code Climate](https://codeclimate.com/github/chilts/sid/badges/gpa.svg)](https://codeclimate.com/github/chilts/sid)
[![Go Report Card](https://goreportcard.com/badge/github.com/chilts/sid)](https://goreportcard.com/report/github.com/chilts/sid)

This package is simple and only provides one function. The aim here is not pure speed, it is for an easy use-case
without having to worry about goroutines and locking.

## Install

```
go get github.com/chilts/sid
```

## Example

```
id1 := sid.Id()
id2 := sid.Id()

fmt.Printf("id1 = %s\n", id1)
fmt.Printf("id2 = %s\n", id2)

// -> "id1 = 1IeSBAWW83j-2wgJ4PUtlAr"
// -> "id2 = 1IeSBAWW9kK-0cDG64GQgGJ"
```

## Author

By [Andrew Chilton](https://chilts.org/), [@twitter](https://twitter.com/andychilton).

For [AppsAttic](https://appsattic.com/), [@AppsAttic](https://twitter.com/AppsAttic).

## License

[MIT](https://publish.li/mit-qLQqmVTO).

(Ends)
107 vendor/github.com/chilts/sid/sid.go generated vendored Normal file
@@ -0,0 +1,107 @@
// Package sid provides the ability to generate Sortable Identifiers. These are also universally unique.
//
// id1 := sid.Id()
// id2 := sid.Id()
// fmt.Printf("id1 = %s\n", id1)
// fmt.Printf("id2 = %s\n", id2)
// // -> "id1 = 1IeSBAWW83j-2wgJ4PUtlAr"
// // -> "id2 = 1IeSBAWW9kK-0cDG64GQgGJ"
//
// This package is simple and only provides one function. The aim here is not pure speed, it is for an easy use-case
// without having to worry about goroutines and locking.
package sid

import (
	"math/rand"
	"strconv"
	"strings"
	"sync"
	"time"
)

func init() {
	rand.Seed(time.Now().UTC().UnixNano())
}

// Remember the lastTime so that if (by chance) we get the same NanoSecond, we just incrememt the last random number.
var mu = &sync.Mutex{}
var lastTime int64
var lastRand int64
var chars = make([]string, 11, 11)

// 64 chars but ordered by ASCII
const base64 string = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz~"

func toStr(now int64) string {
	// now do the generation (backwards, so we just %64 then /64 along the way)
	for i := 10; i >= 0; i-- {
		index := now % 64
		chars[i] = string(base64[index])
		now = now / 64
	}

	return strings.Join(chars, "")
}

// IdBase64 returns a 23 char string based on timestamp and a random number. The format is "XXXXXXXXXXX-YYYYYYYYYYY"
// where X is the timestamp and Y is the random number. If (by any chance) this is called in the same nanosecond, the
// random number is incremented instead of a new one being generated. This makes sure that two consecutive Ids
// generated in the same goroutine also ensure those Ids are also sortable.
//
// It is safe to call from different goroutines since it has it's own locking.
func IdBase64() string {
	// lock for lastTime, lastRand, and chars
	mu.Lock()
	defer mu.Unlock()

	now := time.Now().UTC().UnixNano()
	var r int64

	// if we have the same time, just inc lastRand, else create a new one
	if now == lastTime {
		lastRand++
	} else {
		lastRand = rand.Int63()
	}
	r = lastRand

	// remember this for next time
	lastTime = now

	return toStr(now) + "-" + toStr(r)
}

// Id returns a 23 char string based on timestamp and a random number. The format is "XXXXXXXXXXX-YYYYYYYYYYY" where X
// is the timestamp and Y is the random number. If (by any chance) this is called in the same nanosecond, the random
// number is incremented instead of a new one being generated. This makes sure that two consecutive Ids generated in
// the same goroutine also ensure those Ids are also sortable.
//
// It is safe to call from different goroutines since it has it's own locking.
func Id() string {
	// lock for lastTime, lastRand, and chars
	mu.Lock()
	defer mu.Unlock()

	now := time.Now().UTC().UnixNano()
	var r int64

	// if we have the same time, just inc lastRand, else create a new one
	if now == lastTime {
		lastRand++
	} else {
		lastRand = rand.Int63()
	}
	r = lastRand

	// remember this for next time
	lastTime = now

	nowStr := strconv.FormatInt(now, 10)
	rStr := strconv.FormatInt(r, 10)

	for len(rStr) < 19 {
		rStr = "0" + rStr
	}

	return nowStr + "-" + rStr
}
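Given the doc comments above, a consumer of this package only needs Id() (or IdBase64()) and can rely on plain string comparison for ordering. A minimal sketch under that assumption (import path taken from the vendored directory layout; printed values are illustrative only):

```go
package main

import (
	"fmt"
	"sort"

	"github.com/chilts/sid"
)

func main() {
	// Consecutive ids embed the creation timestamp first, so plain string
	// ordering follows creation order.
	ids := []string{sid.Id(), sid.Id(), sid.Id()}
	fmt.Println(sort.StringsAreSorted(ids)) // expected: true

	// IdBase64 produces the shorter form using the ASCII-ordered
	// 64-character alphabet defined in sid.go.
	fmt.Println(sid.IdBase64())
}
```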
21 vendor/github.com/mattn/go-runewidth/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2016 Yasuhiro Matsumoto

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
27 vendor/github.com/mattn/go-runewidth/README.mkd generated vendored Normal file
@@ -0,0 +1,27 @@
go-runewidth
============

[![Build Status](https://travis-ci.org/mattn/go-runewidth.png?branch=master)](https://travis-ci.org/mattn/go-runewidth)
[![Coverage Status](https://coveralls.io/repos/mattn/go-runewidth/badge.png?branch=HEAD)](https://coveralls.io/r/mattn/go-runewidth?branch=HEAD)
[![GoDoc](https://godoc.org/github.com/mattn/go-runewidth?status.svg)](http://godoc.org/github.com/mattn/go-runewidth)
[![Go Report Card](https://goreportcard.com/badge/github.com/mattn/go-runewidth)](https://goreportcard.com/report/github.com/mattn/go-runewidth)

Provides functions to get fixed width of the character or string.

Usage
-----

```go
runewidth.StringWidth("つのだ☆HIRO") == 12
```

Author
------

Yasuhiro Matsumoto

License
-------

under the MIT License: http://mattn.mit-license.org/2013
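The usage line above only shows StringWidth. A slightly fuller sketch of how the package is typically applied to terminal alignment, assuming the exported StringWidth and Truncate functions and a non-East-Asian locale (see the RUNEWIDTH_EASTASIAN handling in runewidth.go below):

```go
package main

import (
	"fmt"

	runewidth "github.com/mattn/go-runewidth"
)

func main() {
	s := "つのだ☆HIRO"

	// StringWidth reports the display width in terminal cells: CJK runes
	// count as two cells, plain ASCII as one (12 for this string, per the
	// README example above).
	fmt.Println(runewidth.StringWidth(s))

	// Truncate shortens a string to a maximum display width, appending the
	// given tail when it has to cut.
	fmt.Println(runewidth.Truncate(s, 10, "..."))
}
```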
3 vendor/github.com/mattn/go-runewidth/go.mod generated vendored Normal file
@@ -0,0 +1,3 @@
module github.com/mattn/go-runewidth

go 1.9
977 vendor/github.com/mattn/go-runewidth/runewidth.go generated vendored Normal file
@@ -0,0 +1,977 @@
package runewidth

import (
	"os"
)

var (
	// EastAsianWidth will be set true if the current locale is CJK
	EastAsianWidth bool

	// ZeroWidthJoiner is flag to set to use UTR#51 ZWJ
	ZeroWidthJoiner bool

	// DefaultCondition is a condition in current locale
	DefaultCondition = &Condition{}
)

func init() {
	handleEnv()
}

func handleEnv() {
	env := os.Getenv("RUNEWIDTH_EASTASIAN")
	if env == "" {
		EastAsianWidth = IsEastAsian()
	} else {
		EastAsianWidth = env == "1"
	}
	// update DefaultCondition
	DefaultCondition.EastAsianWidth = EastAsianWidth
	DefaultCondition.ZeroWidthJoiner = ZeroWidthJoiner
}
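// Usage note: setting RUNEWIDTH_EASTASIAN=1 forces EastAsianWidth on, any
// other value forces it off, and an unset variable falls back to the
// IsEastAsian() locale probe above. With EastAsianWidth on, characters in the
// "ambiguous" table below are counted as double-width.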

type interval struct {
	first rune
	last  rune
}

type table []interval

func inTables(r rune, ts ...table) bool {
	for _, t := range ts {
		if inTable(r, t) {
			return true
		}
	}
	return false
}

func inTable(r rune, t table) bool {
	// func (t table) IncludesRune(r rune) bool {
	if r < t[0].first {
		return false
	}

	bot := 0
	top := len(t) - 1
	for top >= bot {
		mid := (bot + top) >> 1

		switch {
		case t[mid].last < r:
			bot = mid + 1
		case t[mid].first > r:
			top = mid - 1
		default:
			return true
		}
	}

	return false
}

var private = table{
|
||||||
|
{0x00E000, 0x00F8FF}, {0x0F0000, 0x0FFFFD}, {0x100000, 0x10FFFD},
|
||||||
|
}
|
||||||
|
|
||||||
|
var nonprint = table{
|
||||||
|
{0x0000, 0x001F}, {0x007F, 0x009F}, {0x00AD, 0x00AD},
|
||||||
|
{0x070F, 0x070F}, {0x180B, 0x180E}, {0x200B, 0x200F},
|
||||||
|
{0x2028, 0x202E}, {0x206A, 0x206F}, {0xD800, 0xDFFF},
|
||||||
|
{0xFEFF, 0xFEFF}, {0xFFF9, 0xFFFB}, {0xFFFE, 0xFFFF},
|
||||||
|
}
|
||||||
|
|
||||||
|
var combining = table{
|
||||||
|
{0x0300, 0x036F}, {0x0483, 0x0489}, {0x0591, 0x05BD},
|
||||||
|
{0x05BF, 0x05BF}, {0x05C1, 0x05C2}, {0x05C4, 0x05C5},
|
||||||
|
{0x05C7, 0x05C7}, {0x0610, 0x061A}, {0x064B, 0x065F},
|
||||||
|
{0x0670, 0x0670}, {0x06D6, 0x06DC}, {0x06DF, 0x06E4},
|
||||||
|
{0x06E7, 0x06E8}, {0x06EA, 0x06ED}, {0x0711, 0x0711},
|
||||||
|
{0x0730, 0x074A}, {0x07A6, 0x07B0}, {0x07EB, 0x07F3},
|
||||||
|
{0x0816, 0x0819}, {0x081B, 0x0823}, {0x0825, 0x0827},
|
||||||
|
{0x0829, 0x082D}, {0x0859, 0x085B}, {0x08D4, 0x08E1},
|
||||||
|
{0x08E3, 0x0903}, {0x093A, 0x093C}, {0x093E, 0x094F},
|
||||||
|
{0x0951, 0x0957}, {0x0962, 0x0963}, {0x0981, 0x0983},
|
||||||
|
{0x09BC, 0x09BC}, {0x09BE, 0x09C4}, {0x09C7, 0x09C8},
|
||||||
|
{0x09CB, 0x09CD}, {0x09D7, 0x09D7}, {0x09E2, 0x09E3},
|
||||||
|
{0x0A01, 0x0A03}, {0x0A3C, 0x0A3C}, {0x0A3E, 0x0A42},
|
||||||
|
{0x0A47, 0x0A48}, {0x0A4B, 0x0A4D}, {0x0A51, 0x0A51},
|
||||||
|
{0x0A70, 0x0A71}, {0x0A75, 0x0A75}, {0x0A81, 0x0A83},
|
||||||
|
{0x0ABC, 0x0ABC}, {0x0ABE, 0x0AC5}, {0x0AC7, 0x0AC9},
|
||||||
|
{0x0ACB, 0x0ACD}, {0x0AE2, 0x0AE3}, {0x0B01, 0x0B03},
|
||||||
|
{0x0B3C, 0x0B3C}, {0x0B3E, 0x0B44}, {0x0B47, 0x0B48},
|
||||||
|
{0x0B4B, 0x0B4D}, {0x0B56, 0x0B57}, {0x0B62, 0x0B63},
|
||||||
|
{0x0B82, 0x0B82}, {0x0BBE, 0x0BC2}, {0x0BC6, 0x0BC8},
|
||||||
|
{0x0BCA, 0x0BCD}, {0x0BD7, 0x0BD7}, {0x0C00, 0x0C03},
|
||||||
|
{0x0C3E, 0x0C44}, {0x0C46, 0x0C48}, {0x0C4A, 0x0C4D},
|
||||||
|
{0x0C55, 0x0C56}, {0x0C62, 0x0C63}, {0x0C81, 0x0C83},
|
||||||
|
{0x0CBC, 0x0CBC}, {0x0CBE, 0x0CC4}, {0x0CC6, 0x0CC8},
|
||||||
|
{0x0CCA, 0x0CCD}, {0x0CD5, 0x0CD6}, {0x0CE2, 0x0CE3},
|
||||||
|
{0x0D01, 0x0D03}, {0x0D3E, 0x0D44}, {0x0D46, 0x0D48},
|
||||||
|
{0x0D4A, 0x0D4D}, {0x0D57, 0x0D57}, {0x0D62, 0x0D63},
|
||||||
|
{0x0D82, 0x0D83}, {0x0DCA, 0x0DCA}, {0x0DCF, 0x0DD4},
|
||||||
|
{0x0DD6, 0x0DD6}, {0x0DD8, 0x0DDF}, {0x0DF2, 0x0DF3},
|
||||||
|
{0x0E31, 0x0E31}, {0x0E34, 0x0E3A}, {0x0E47, 0x0E4E},
|
||||||
|
{0x0EB1, 0x0EB1}, {0x0EB4, 0x0EB9}, {0x0EBB, 0x0EBC},
|
||||||
|
{0x0EC8, 0x0ECD}, {0x0F18, 0x0F19}, {0x0F35, 0x0F35},
|
||||||
|
{0x0F37, 0x0F37}, {0x0F39, 0x0F39}, {0x0F3E, 0x0F3F},
|
||||||
|
{0x0F71, 0x0F84}, {0x0F86, 0x0F87}, {0x0F8D, 0x0F97},
|
||||||
|
{0x0F99, 0x0FBC}, {0x0FC6, 0x0FC6}, {0x102B, 0x103E},
|
||||||
|
{0x1056, 0x1059}, {0x105E, 0x1060}, {0x1062, 0x1064},
|
||||||
|
{0x1067, 0x106D}, {0x1071, 0x1074}, {0x1082, 0x108D},
|
||||||
|
{0x108F, 0x108F}, {0x109A, 0x109D}, {0x135D, 0x135F},
|
||||||
|
{0x1712, 0x1714}, {0x1732, 0x1734}, {0x1752, 0x1753},
|
||||||
|
{0x1772, 0x1773}, {0x17B4, 0x17D3}, {0x17DD, 0x17DD},
|
||||||
|
{0x180B, 0x180D}, {0x1885, 0x1886}, {0x18A9, 0x18A9},
|
||||||
|
{0x1920, 0x192B}, {0x1930, 0x193B}, {0x1A17, 0x1A1B},
|
||||||
|
{0x1A55, 0x1A5E}, {0x1A60, 0x1A7C}, {0x1A7F, 0x1A7F},
|
||||||
|
{0x1AB0, 0x1ABE}, {0x1B00, 0x1B04}, {0x1B34, 0x1B44},
|
||||||
|
{0x1B6B, 0x1B73}, {0x1B80, 0x1B82}, {0x1BA1, 0x1BAD},
|
||||||
|
{0x1BE6, 0x1BF3}, {0x1C24, 0x1C37}, {0x1CD0, 0x1CD2},
|
||||||
|
{0x1CD4, 0x1CE8}, {0x1CED, 0x1CED}, {0x1CF2, 0x1CF4},
|
||||||
|
{0x1CF8, 0x1CF9}, {0x1DC0, 0x1DF5}, {0x1DFB, 0x1DFF},
|
||||||
|
{0x20D0, 0x20F0}, {0x2CEF, 0x2CF1}, {0x2D7F, 0x2D7F},
|
||||||
|
{0x2DE0, 0x2DFF}, {0x302A, 0x302F}, {0x3099, 0x309A},
|
||||||
|
{0xA66F, 0xA672}, {0xA674, 0xA67D}, {0xA69E, 0xA69F},
|
||||||
|
{0xA6F0, 0xA6F1}, {0xA802, 0xA802}, {0xA806, 0xA806},
|
||||||
|
{0xA80B, 0xA80B}, {0xA823, 0xA827}, {0xA880, 0xA881},
|
||||||
|
{0xA8B4, 0xA8C5}, {0xA8E0, 0xA8F1}, {0xA926, 0xA92D},
|
||||||
|
{0xA947, 0xA953}, {0xA980, 0xA983}, {0xA9B3, 0xA9C0},
|
||||||
|
{0xA9E5, 0xA9E5}, {0xAA29, 0xAA36}, {0xAA43, 0xAA43},
|
||||||
|
{0xAA4C, 0xAA4D}, {0xAA7B, 0xAA7D}, {0xAAB0, 0xAAB0},
|
||||||
|
{0xAAB2, 0xAAB4}, {0xAAB7, 0xAAB8}, {0xAABE, 0xAABF},
|
||||||
|
{0xAAC1, 0xAAC1}, {0xAAEB, 0xAAEF}, {0xAAF5, 0xAAF6},
|
||||||
|
{0xABE3, 0xABEA}, {0xABEC, 0xABED}, {0xFB1E, 0xFB1E},
|
||||||
|
{0xFE00, 0xFE0F}, {0xFE20, 0xFE2F}, {0x101FD, 0x101FD},
|
||||||
|
{0x102E0, 0x102E0}, {0x10376, 0x1037A}, {0x10A01, 0x10A03},
|
||||||
|
{0x10A05, 0x10A06}, {0x10A0C, 0x10A0F}, {0x10A38, 0x10A3A},
|
||||||
|
{0x10A3F, 0x10A3F}, {0x10AE5, 0x10AE6}, {0x11000, 0x11002},
|
||||||
|
{0x11038, 0x11046}, {0x1107F, 0x11082}, {0x110B0, 0x110BA},
|
||||||
|
{0x11100, 0x11102}, {0x11127, 0x11134}, {0x11173, 0x11173},
|
||||||
|
{0x11180, 0x11182}, {0x111B3, 0x111C0}, {0x111CA, 0x111CC},
|
||||||
|
{0x1122C, 0x11237}, {0x1123E, 0x1123E}, {0x112DF, 0x112EA},
|
||||||
|
{0x11300, 0x11303}, {0x1133C, 0x1133C}, {0x1133E, 0x11344},
|
||||||
|
{0x11347, 0x11348}, {0x1134B, 0x1134D}, {0x11357, 0x11357},
|
||||||
|
{0x11362, 0x11363}, {0x11366, 0x1136C}, {0x11370, 0x11374},
|
||||||
|
{0x11435, 0x11446}, {0x114B0, 0x114C3}, {0x115AF, 0x115B5},
|
||||||
|
{0x115B8, 0x115C0}, {0x115DC, 0x115DD}, {0x11630, 0x11640},
|
||||||
|
{0x116AB, 0x116B7}, {0x1171D, 0x1172B}, {0x11C2F, 0x11C36},
|
||||||
|
{0x11C38, 0x11C3F}, {0x11C92, 0x11CA7}, {0x11CA9, 0x11CB6},
|
||||||
|
{0x16AF0, 0x16AF4}, {0x16B30, 0x16B36}, {0x16F51, 0x16F7E},
|
||||||
|
{0x16F8F, 0x16F92}, {0x1BC9D, 0x1BC9E}, {0x1D165, 0x1D169},
|
||||||
|
{0x1D16D, 0x1D172}, {0x1D17B, 0x1D182}, {0x1D185, 0x1D18B},
|
||||||
|
{0x1D1AA, 0x1D1AD}, {0x1D242, 0x1D244}, {0x1DA00, 0x1DA36},
|
||||||
|
{0x1DA3B, 0x1DA6C}, {0x1DA75, 0x1DA75}, {0x1DA84, 0x1DA84},
|
||||||
|
{0x1DA9B, 0x1DA9F}, {0x1DAA1, 0x1DAAF}, {0x1E000, 0x1E006},
|
||||||
|
{0x1E008, 0x1E018}, {0x1E01B, 0x1E021}, {0x1E023, 0x1E024},
|
||||||
|
{0x1E026, 0x1E02A}, {0x1E8D0, 0x1E8D6}, {0x1E944, 0x1E94A},
|
||||||
|
{0xE0100, 0xE01EF},
|
||||||
|
}
|
||||||
|
|
||||||
|
var doublewidth = table{
|
||||||
|
{0x1100, 0x115F}, {0x231A, 0x231B}, {0x2329, 0x232A},
|
||||||
|
{0x23E9, 0x23EC}, {0x23F0, 0x23F0}, {0x23F3, 0x23F3},
|
||||||
|
{0x25FD, 0x25FE}, {0x2614, 0x2615}, {0x2648, 0x2653},
|
||||||
|
{0x267F, 0x267F}, {0x2693, 0x2693}, {0x26A1, 0x26A1},
|
||||||
|
{0x26AA, 0x26AB}, {0x26BD, 0x26BE}, {0x26C4, 0x26C5},
|
||||||
|
{0x26CE, 0x26CE}, {0x26D4, 0x26D4}, {0x26EA, 0x26EA},
|
||||||
|
{0x26F2, 0x26F3}, {0x26F5, 0x26F5}, {0x26FA, 0x26FA},
|
||||||
|
{0x26FD, 0x26FD}, {0x2705, 0x2705}, {0x270A, 0x270B},
|
||||||
|
{0x2728, 0x2728}, {0x274C, 0x274C}, {0x274E, 0x274E},
|
||||||
|
{0x2753, 0x2755}, {0x2757, 0x2757}, {0x2795, 0x2797},
|
||||||
|
{0x27B0, 0x27B0}, {0x27BF, 0x27BF}, {0x2B1B, 0x2B1C},
|
||||||
|
{0x2B50, 0x2B50}, {0x2B55, 0x2B55}, {0x2E80, 0x2E99},
|
||||||
|
{0x2E9B, 0x2EF3}, {0x2F00, 0x2FD5}, {0x2FF0, 0x2FFB},
|
||||||
|
{0x3000, 0x303E}, {0x3041, 0x3096}, {0x3099, 0x30FF},
|
||||||
|
{0x3105, 0x312D}, {0x3131, 0x318E}, {0x3190, 0x31BA},
|
||||||
|
{0x31C0, 0x31E3}, {0x31F0, 0x321E}, {0x3220, 0x3247},
|
||||||
|
{0x3250, 0x32FE}, {0x3300, 0x4DBF}, {0x4E00, 0xA48C},
|
||||||
|
{0xA490, 0xA4C6}, {0xA960, 0xA97C}, {0xAC00, 0xD7A3},
|
||||||
|
{0xF900, 0xFAFF}, {0xFE10, 0xFE19}, {0xFE30, 0xFE52},
|
||||||
|
{0xFE54, 0xFE66}, {0xFE68, 0xFE6B}, {0xFF01, 0xFF60},
|
||||||
|
{0xFFE0, 0xFFE6}, {0x16FE0, 0x16FE0}, {0x17000, 0x187EC},
|
||||||
|
{0x18800, 0x18AF2}, {0x1B000, 0x1B001}, {0x1F004, 0x1F004},
|
||||||
|
{0x1F0CF, 0x1F0CF}, {0x1F18E, 0x1F18E}, {0x1F191, 0x1F19A},
|
||||||
|
{0x1F200, 0x1F202}, {0x1F210, 0x1F23B}, {0x1F240, 0x1F248},
|
||||||
|
{0x1F250, 0x1F251}, {0x1F300, 0x1F320}, {0x1F32D, 0x1F335},
|
||||||
|
{0x1F337, 0x1F37C}, {0x1F37E, 0x1F393}, {0x1F3A0, 0x1F3CA},
|
||||||
|
{0x1F3CF, 0x1F3D3}, {0x1F3E0, 0x1F3F0}, {0x1F3F4, 0x1F3F4},
|
||||||
|
{0x1F3F8, 0x1F43E}, {0x1F440, 0x1F440}, {0x1F442, 0x1F4FC},
|
||||||
|
{0x1F4FF, 0x1F53D}, {0x1F54B, 0x1F54E}, {0x1F550, 0x1F567},
|
||||||
|
{0x1F57A, 0x1F57A}, {0x1F595, 0x1F596}, {0x1F5A4, 0x1F5A4},
|
||||||
|
{0x1F5FB, 0x1F64F}, {0x1F680, 0x1F6C5}, {0x1F6CC, 0x1F6CC},
|
||||||
|
{0x1F6D0, 0x1F6D2}, {0x1F6EB, 0x1F6EC}, {0x1F6F4, 0x1F6F6},
|
||||||
|
{0x1F910, 0x1F91E}, {0x1F920, 0x1F927}, {0x1F930, 0x1F930},
|
||||||
|
{0x1F933, 0x1F93E}, {0x1F940, 0x1F94B}, {0x1F950, 0x1F95E},
|
||||||
|
{0x1F980, 0x1F991}, {0x1F9C0, 0x1F9C0}, {0x20000, 0x2FFFD},
|
||||||
|
{0x30000, 0x3FFFD},
|
||||||
|
}
|
||||||
|
|
||||||
|
var ambiguous = table{
|
||||||
|
{0x00A1, 0x00A1}, {0x00A4, 0x00A4}, {0x00A7, 0x00A8},
|
||||||
|
{0x00AA, 0x00AA}, {0x00AD, 0x00AE}, {0x00B0, 0x00B4},
|
||||||
|
{0x00B6, 0x00BA}, {0x00BC, 0x00BF}, {0x00C6, 0x00C6},
|
||||||
|
{0x00D0, 0x00D0}, {0x00D7, 0x00D8}, {0x00DE, 0x00E1},
|
||||||
|
{0x00E6, 0x00E6}, {0x00E8, 0x00EA}, {0x00EC, 0x00ED},
|
||||||
|
{0x00F0, 0x00F0}, {0x00F2, 0x00F3}, {0x00F7, 0x00FA},
|
||||||
|
{0x00FC, 0x00FC}, {0x00FE, 0x00FE}, {0x0101, 0x0101},
|
||||||
|
{0x0111, 0x0111}, {0x0113, 0x0113}, {0x011B, 0x011B},
|
||||||
|
{0x0126, 0x0127}, {0x012B, 0x012B}, {0x0131, 0x0133},
|
||||||
|
{0x0138, 0x0138}, {0x013F, 0x0142}, {0x0144, 0x0144},
|
||||||
|
{0x0148, 0x014B}, {0x014D, 0x014D}, {0x0152, 0x0153},
|
||||||
|
{0x0166, 0x0167}, {0x016B, 0x016B}, {0x01CE, 0x01CE},
|
||||||
|
{0x01D0, 0x01D0}, {0x01D2, 0x01D2}, {0x01D4, 0x01D4},
|
||||||
|
{0x01D6, 0x01D6}, {0x01D8, 0x01D8}, {0x01DA, 0x01DA},
|
||||||
|
{0x01DC, 0x01DC}, {0x0251, 0x0251}, {0x0261, 0x0261},
|
||||||
|
{0x02C4, 0x02C4}, {0x02C7, 0x02C7}, {0x02C9, 0x02CB},
|
||||||
|
{0x02CD, 0x02CD}, {0x02D0, 0x02D0}, {0x02D8, 0x02DB},
|
||||||
|
{0x02DD, 0x02DD}, {0x02DF, 0x02DF}, {0x0300, 0x036F},
|
||||||
|
{0x0391, 0x03A1}, {0x03A3, 0x03A9}, {0x03B1, 0x03C1},
|
||||||
|
{0x03C3, 0x03C9}, {0x0401, 0x0401}, {0x0410, 0x044F},
|
||||||
|
{0x0451, 0x0451}, {0x2010, 0x2010}, {0x2013, 0x2016},
|
||||||
|
{0x2018, 0x2019}, {0x201C, 0x201D}, {0x2020, 0x2022},
|
||||||
|
{0x2024, 0x2027}, {0x2030, 0x2030}, {0x2032, 0x2033},
|
||||||
|
{0x2035, 0x2035}, {0x203B, 0x203B}, {0x203E, 0x203E},
|
||||||
|
{0x2074, 0x2074}, {0x207F, 0x207F}, {0x2081, 0x2084},
|
||||||
|
{0x20AC, 0x20AC}, {0x2103, 0x2103}, {0x2105, 0x2105},
|
||||||
|
{0x2109, 0x2109}, {0x2113, 0x2113}, {0x2116, 0x2116},
|
||||||
|
{0x2121, 0x2122}, {0x2126, 0x2126}, {0x212B, 0x212B},
|
||||||
|
{0x2153, 0x2154}, {0x215B, 0x215E}, {0x2160, 0x216B},
|
||||||
|
{0x2170, 0x2179}, {0x2189, 0x2189}, {0x2190, 0x2199},
|
||||||
|
{0x21B8, 0x21B9}, {0x21D2, 0x21D2}, {0x21D4, 0x21D4},
|
||||||
|
{0x21E7, 0x21E7}, {0x2200, 0x2200}, {0x2202, 0x2203},
|
||||||
|
{0x2207, 0x2208}, {0x220B, 0x220B}, {0x220F, 0x220F},
|
||||||
|
{0x2211, 0x2211}, {0x2215, 0x2215}, {0x221A, 0x221A},
|
||||||
|
{0x221D, 0x2220}, {0x2223, 0x2223}, {0x2225, 0x2225},
|
||||||
|
{0x2227, 0x222C}, {0x222E, 0x222E}, {0x2234, 0x2237},
|
||||||
|
{0x223C, 0x223D}, {0x2248, 0x2248}, {0x224C, 0x224C},
|
||||||
|
{0x2252, 0x2252}, {0x2260, 0x2261}, {0x2264, 0x2267},
|
||||||
|
{0x226A, 0x226B}, {0x226E, 0x226F}, {0x2282, 0x2283},
|
||||||
|
{0x2286, 0x2287}, {0x2295, 0x2295}, {0x2299, 0x2299},
|
||||||
|
{0x22A5, 0x22A5}, {0x22BF, 0x22BF}, {0x2312, 0x2312},
|
||||||
|
{0x2460, 0x24E9}, {0x24EB, 0x254B}, {0x2550, 0x2573},
|
||||||
|
{0x2580, 0x258F}, {0x2592, 0x2595}, {0x25A0, 0x25A1},
|
||||||
|
{0x25A3, 0x25A9}, {0x25B2, 0x25B3}, {0x25B6, 0x25B7},
|
||||||
|
{0x25BC, 0x25BD}, {0x25C0, 0x25C1}, {0x25C6, 0x25C8},
|
||||||
|
{0x25CB, 0x25CB}, {0x25CE, 0x25D1}, {0x25E2, 0x25E5},
|
||||||
|
{0x25EF, 0x25EF}, {0x2605, 0x2606}, {0x2609, 0x2609},
|
||||||
|
{0x260E, 0x260F}, {0x261C, 0x261C}, {0x261E, 0x261E},
|
||||||
|
{0x2640, 0x2640}, {0x2642, 0x2642}, {0x2660, 0x2661},
|
||||||
|
{0x2663, 0x2665}, {0x2667, 0x266A}, {0x266C, 0x266D},
|
||||||
|
{0x266F, 0x266F}, {0x269E, 0x269F}, {0x26BF, 0x26BF},
|
||||||
|
{0x26C6, 0x26CD}, {0x26CF, 0x26D3}, {0x26D5, 0x26E1},
|
||||||
|
{0x26E3, 0x26E3}, {0x26E8, 0x26E9}, {0x26EB, 0x26F1},
|
||||||
|
{0x26F4, 0x26F4}, {0x26F6, 0x26F9}, {0x26FB, 0x26FC},
|
||||||
|
{0x26FE, 0x26FF}, {0x273D, 0x273D}, {0x2776, 0x277F},
|
||||||
|
{0x2B56, 0x2B59}, {0x3248, 0x324F}, {0xE000, 0xF8FF},
|
||||||
|
{0xFE00, 0xFE0F}, {0xFFFD, 0xFFFD}, {0x1F100, 0x1F10A},
|
||||||
|
{0x1F110, 0x1F12D}, {0x1F130, 0x1F169}, {0x1F170, 0x1F18D},
|
||||||
|
{0x1F18F, 0x1F190}, {0x1F19B, 0x1F1AC}, {0xE0100, 0xE01EF},
|
||||||
|
{0xF0000, 0xFFFFD}, {0x100000, 0x10FFFD},
|
||||||
|
}
|
||||||
|
|
||||||
|
var emoji = table{
|
||||||
|
{0x203C, 0x203C}, {0x2049, 0x2049}, {0x2122, 0x2122},
|
||||||
|
{0x2139, 0x2139}, {0x2194, 0x2199}, {0x21A9, 0x21AA},
|
||||||
|
{0x231A, 0x231B}, {0x2328, 0x2328}, {0x23CF, 0x23CF},
|
||||||
|
{0x23E9, 0x23F3}, {0x23F8, 0x23FA}, {0x24C2, 0x24C2},
|
||||||
|
{0x25AA, 0x25AB}, {0x25B6, 0x25B6}, {0x25C0, 0x25C0},
|
||||||
|
{0x25FB, 0x25FE}, {0x2600, 0x2604}, {0x260E, 0x260E},
|
||||||
|
{0x2611, 0x2611}, {0x2614, 0x2615}, {0x2618, 0x2618},
|
||||||
|
{0x261D, 0x261D}, {0x2620, 0x2620}, {0x2622, 0x2623},
|
||||||
|
{0x2626, 0x2626}, {0x262A, 0x262A}, {0x262E, 0x262F},
|
||||||
|
{0x2638, 0x263A}, {0x2640, 0x2640}, {0x2642, 0x2642},
|
||||||
|
{0x2648, 0x2653}, {0x265F, 0x2660}, {0x2663, 0x2663},
|
||||||
|
{0x2665, 0x2666}, {0x2668, 0x2668}, {0x267B, 0x267B},
|
||||||
|
{0x267E, 0x267F}, {0x2692, 0x2697}, {0x2699, 0x2699},
|
||||||
|
{0x269B, 0x269C}, {0x26A0, 0x26A1}, {0x26AA, 0x26AB},
|
||||||
|
{0x26B0, 0x26B1}, {0x26BD, 0x26BE}, {0x26C4, 0x26C5},
|
||||||
|
{0x26C8, 0x26C8}, {0x26CE, 0x26CF}, {0x26D1, 0x26D1},
|
||||||
|
{0x26D3, 0x26D4}, {0x26E9, 0x26EA}, {0x26F0, 0x26F5},
|
||||||
|
{0x26F7, 0x26FA}, {0x26FD, 0x26FD}, {0x2702, 0x2702},
|
||||||
|
{0x2705, 0x2705}, {0x2708, 0x270D}, {0x270F, 0x270F},
|
||||||
|
{0x2712, 0x2712}, {0x2714, 0x2714}, {0x2716, 0x2716},
|
||||||
|
{0x271D, 0x271D}, {0x2721, 0x2721}, {0x2728, 0x2728},
|
||||||
|
{0x2733, 0x2734}, {0x2744, 0x2744}, {0x2747, 0x2747},
|
||||||
|
{0x274C, 0x274C}, {0x274E, 0x274E}, {0x2753, 0x2755},
|
||||||
|
{0x2757, 0x2757}, {0x2763, 0x2764}, {0x2795, 0x2797},
|
||||||
|
{0x27A1, 0x27A1}, {0x27B0, 0x27B0}, {0x27BF, 0x27BF},
|
||||||
|
{0x2934, 0x2935}, {0x2B05, 0x2B07}, {0x2B1B, 0x2B1C},
|
||||||
|
{0x2B50, 0x2B50}, {0x2B55, 0x2B55}, {0x3030, 0x3030},
|
||||||
|
{0x303D, 0x303D}, {0x3297, 0x3297}, {0x3299, 0x3299},
|
||||||
|
{0x1F004, 0x1F004}, {0x1F0CF, 0x1F0CF}, {0x1F170, 0x1F171},
|
||||||
|
{0x1F17E, 0x1F17F}, {0x1F18E, 0x1F18E}, {0x1F191, 0x1F19A},
|
||||||
|
{0x1F1E6, 0x1F1FF}, {0x1F201, 0x1F202}, {0x1F21A, 0x1F21A},
|
||||||
|
{0x1F22F, 0x1F22F}, {0x1F232, 0x1F23A}, {0x1F250, 0x1F251},
|
||||||
|
{0x1F300, 0x1F321}, {0x1F324, 0x1F393}, {0x1F396, 0x1F397},
|
||||||
|
{0x1F399, 0x1F39B}, {0x1F39E, 0x1F3F0}, {0x1F3F3, 0x1F3F5},
|
||||||
|
{0x1F3F7, 0x1F4FD}, {0x1F4FF, 0x1F53D}, {0x1F549, 0x1F54E},
|
||||||
|
{0x1F550, 0x1F567}, {0x1F56F, 0x1F570}, {0x1F573, 0x1F57A},
|
||||||
|
{0x1F587, 0x1F587}, {0x1F58A, 0x1F58D}, {0x1F590, 0x1F590},
|
||||||
|
{0x1F595, 0x1F596}, {0x1F5A4, 0x1F5A5}, {0x1F5A8, 0x1F5A8},
|
||||||
|
{0x1F5B1, 0x1F5B2}, {0x1F5BC, 0x1F5BC}, {0x1F5C2, 0x1F5C4},
|
||||||
|
{0x1F5D1, 0x1F5D3}, {0x1F5DC, 0x1F5DE}, {0x1F5E1, 0x1F5E1},
|
||||||
|
{0x1F5E3, 0x1F5E3}, {0x1F5E8, 0x1F5E8}, {0x1F5EF, 0x1F5EF},
|
||||||
|
{0x1F5F3, 0x1F5F3}, {0x1F5FA, 0x1F64F}, {0x1F680, 0x1F6C5},
|
||||||
|
{0x1F6CB, 0x1F6D2}, {0x1F6E0, 0x1F6E5}, {0x1F6E9, 0x1F6E9},
|
||||||
|
{0x1F6EB, 0x1F6EC}, {0x1F6F0, 0x1F6F0}, {0x1F6F3, 0x1F6F9},
|
||||||
|
{0x1F910, 0x1F93A}, {0x1F93C, 0x1F93E}, {0x1F940, 0x1F945},
|
||||||
|
{0x1F947, 0x1F970}, {0x1F973, 0x1F976}, {0x1F97A, 0x1F97A},
|
||||||
|
{0x1F97C, 0x1F9A2}, {0x1F9B0, 0x1F9B9}, {0x1F9C0, 0x1F9C2},
|
||||||
|
{0x1F9D0, 0x1F9FF},
|
||||||
|
}
|
||||||
|
|
||||||
|
var notassigned = table{
|
||||||
|
{0x0378, 0x0379}, {0x0380, 0x0383}, {0x038B, 0x038B},
|
||||||
|
{0x038D, 0x038D}, {0x03A2, 0x03A2}, {0x0530, 0x0530},
|
||||||
|
{0x0557, 0x0558}, {0x0560, 0x0560}, {0x0588, 0x0588},
|
||||||
|
{0x058B, 0x058C}, {0x0590, 0x0590}, {0x05C8, 0x05CF},
|
||||||
|
{0x05EB, 0x05EF}, {0x05F5, 0x05FF}, {0x061D, 0x061D},
|
||||||
|
{0x070E, 0x070E}, {0x074B, 0x074C}, {0x07B2, 0x07BF},
|
||||||
|
{0x07FB, 0x07FF}, {0x082E, 0x082F}, {0x083F, 0x083F},
|
||||||
|
{0x085C, 0x085D}, {0x085F, 0x089F}, {0x08B5, 0x08B5},
|
||||||
|
{0x08BE, 0x08D3}, {0x0984, 0x0984}, {0x098D, 0x098E},
|
||||||
|
{0x0991, 0x0992}, {0x09A9, 0x09A9}, {0x09B1, 0x09B1},
|
||||||
|
{0x09B3, 0x09B5}, {0x09BA, 0x09BB}, {0x09C5, 0x09C6},
|
||||||
|
{0x09C9, 0x09CA}, {0x09CF, 0x09D6}, {0x09D8, 0x09DB},
|
||||||
|
{0x09DE, 0x09DE}, {0x09E4, 0x09E5}, {0x09FC, 0x0A00},
|
||||||
|
{0x0A04, 0x0A04}, {0x0A0B, 0x0A0E}, {0x0A11, 0x0A12},
|
||||||
|
{0x0A29, 0x0A29}, {0x0A31, 0x0A31}, {0x0A34, 0x0A34},
|
||||||
|
{0x0A37, 0x0A37}, {0x0A3A, 0x0A3B}, {0x0A3D, 0x0A3D},
|
||||||
|
{0x0A43, 0x0A46}, {0x0A49, 0x0A4A}, {0x0A4E, 0x0A50},
|
||||||
|
{0x0A52, 0x0A58}, {0x0A5D, 0x0A5D}, {0x0A5F, 0x0A65},
|
||||||
|
{0x0A76, 0x0A80}, {0x0A84, 0x0A84}, {0x0A8E, 0x0A8E},
|
||||||
|
{0x0A92, 0x0A92}, {0x0AA9, 0x0AA9}, {0x0AB1, 0x0AB1},
|
||||||
|
{0x0AB4, 0x0AB4}, {0x0ABA, 0x0ABB}, {0x0AC6, 0x0AC6},
|
||||||
|
{0x0ACA, 0x0ACA}, {0x0ACE, 0x0ACF}, {0x0AD1, 0x0ADF},
|
||||||
|
{0x0AE4, 0x0AE5}, {0x0AF2, 0x0AF8}, {0x0AFA, 0x0B00},
|
||||||
|
{0x0B04, 0x0B04}, {0x0B0D, 0x0B0E}, {0x0B11, 0x0B12},
|
||||||
|
{0x0B29, 0x0B29}, {0x0B31, 0x0B31}, {0x0B34, 0x0B34},
|
||||||
|
{0x0B3A, 0x0B3B}, {0x0B45, 0x0B46}, {0x0B49, 0x0B4A},
|
||||||
|
{0x0B4E, 0x0B55}, {0x0B58, 0x0B5B}, {0x0B5E, 0x0B5E},
|
||||||
|
{0x0B64, 0x0B65}, {0x0B78, 0x0B81}, {0x0B84, 0x0B84},
|
||||||
|
{0x0B8B, 0x0B8D}, {0x0B91, 0x0B91}, {0x0B96, 0x0B98},
|
||||||
|
{0x0B9B, 0x0B9B}, {0x0B9D, 0x0B9D}, {0x0BA0, 0x0BA2},
|
||||||
|
{0x0BA5, 0x0BA7}, {0x0BAB, 0x0BAD}, {0x0BBA, 0x0BBD},
|
||||||
|
{0x0BC3, 0x0BC5}, {0x0BC9, 0x0BC9}, {0x0BCE, 0x0BCF},
|
||||||
|
{0x0BD1, 0x0BD6}, {0x0BD8, 0x0BE5}, {0x0BFB, 0x0BFF},
|
||||||
|
{0x0C04, 0x0C04}, {0x0C0D, 0x0C0D}, {0x0C11, 0x0C11},
|
||||||
|
{0x0C29, 0x0C29}, {0x0C3A, 0x0C3C}, {0x0C45, 0x0C45},
|
||||||
|
{0x0C49, 0x0C49}, {0x0C4E, 0x0C54}, {0x0C57, 0x0C57},
|
||||||
|
{0x0C5B, 0x0C5F}, {0x0C64, 0x0C65}, {0x0C70, 0x0C77},
|
||||||
|
{0x0C84, 0x0C84}, {0x0C8D, 0x0C8D}, {0x0C91, 0x0C91},
|
||||||
|
{0x0CA9, 0x0CA9}, {0x0CB4, 0x0CB4}, {0x0CBA, 0x0CBB},
|
||||||
|
{0x0CC5, 0x0CC5}, {0x0CC9, 0x0CC9}, {0x0CCE, 0x0CD4},
|
||||||
|
{0x0CD7, 0x0CDD}, {0x0CDF, 0x0CDF}, {0x0CE4, 0x0CE5},
|
||||||
|
{0x0CF0, 0x0CF0}, {0x0CF3, 0x0D00}, {0x0D04, 0x0D04},
|
||||||
|
{0x0D0D, 0x0D0D}, {0x0D11, 0x0D11}, {0x0D3B, 0x0D3C},
|
||||||
|
{0x0D45, 0x0D45}, {0x0D49, 0x0D49}, {0x0D50, 0x0D53},
|
||||||
|
{0x0D64, 0x0D65}, {0x0D80, 0x0D81}, {0x0D84, 0x0D84},
|
||||||
|
{0x0D97, 0x0D99}, {0x0DB2, 0x0DB2}, {0x0DBC, 0x0DBC},
|
||||||
|
{0x0DBE, 0x0DBF}, {0x0DC7, 0x0DC9}, {0x0DCB, 0x0DCE},
|
||||||
|
{0x0DD5, 0x0DD5}, {0x0DD7, 0x0DD7}, {0x0DE0, 0x0DE5},
|
||||||
|
{0x0DF0, 0x0DF1}, {0x0DF5, 0x0E00}, {0x0E3B, 0x0E3E},
|
||||||
|
{0x0E5C, 0x0E80}, {0x0E83, 0x0E83}, {0x0E85, 0x0E86},
|
||||||
|
{0x0E89, 0x0E89}, {0x0E8B, 0x0E8C}, {0x0E8E, 0x0E93},
|
||||||
|
{0x0E98, 0x0E98}, {0x0EA0, 0x0EA0}, {0x0EA4, 0x0EA4},
|
||||||
|
{0x0EA6, 0x0EA6}, {0x0EA8, 0x0EA9}, {0x0EAC, 0x0EAC},
|
||||||
|
{0x0EBA, 0x0EBA}, {0x0EBE, 0x0EBF}, {0x0EC5, 0x0EC5},
|
||||||
|
{0x0EC7, 0x0EC7}, {0x0ECE, 0x0ECF}, {0x0EDA, 0x0EDB},
|
||||||
|
{0x0EE0, 0x0EFF}, {0x0F48, 0x0F48}, {0x0F6D, 0x0F70},
|
||||||
|
{0x0F98, 0x0F98}, {0x0FBD, 0x0FBD}, {0x0FCD, 0x0FCD},
|
||||||
|
{0x0FDB, 0x0FFF}, {0x10C6, 0x10C6}, {0x10C8, 0x10CC},
|
||||||
|
{0x10CE, 0x10CF}, {0x1249, 0x1249}, {0x124E, 0x124F},
|
||||||
|
{0x1257, 0x1257}, {0x1259, 0x1259}, {0x125E, 0x125F},
|
||||||
|
{0x1289, 0x1289}, {0x128E, 0x128F}, {0x12B1, 0x12B1},
|
||||||
|
{0x12B6, 0x12B7}, {0x12BF, 0x12BF}, {0x12C1, 0x12C1},
|
||||||
|
{0x12C6, 0x12C7}, {0x12D7, 0x12D7}, {0x1311, 0x1311},
|
||||||
|
{0x1316, 0x1317}, {0x135B, 0x135C}, {0x137D, 0x137F},
|
||||||
|
{0x139A, 0x139F}, {0x13F6, 0x13F7}, {0x13FE, 0x13FF},
|
||||||
|
{0x169D, 0x169F}, {0x16F9, 0x16FF}, {0x170D, 0x170D},
|
||||||
|
{0x1715, 0x171F}, {0x1737, 0x173F}, {0x1754, 0x175F},
|
||||||
|
{0x176D, 0x176D}, {0x1771, 0x1771}, {0x1774, 0x177F},
|
||||||
|
{0x17DE, 0x17DF}, {0x17EA, 0x17EF}, {0x17FA, 0x17FF},
|
||||||
|
{0x180F, 0x180F}, {0x181A, 0x181F}, {0x1878, 0x187F},
|
||||||
|
{0x18AB, 0x18AF}, {0x18F6, 0x18FF}, {0x191F, 0x191F},
|
||||||
|
{0x192C, 0x192F}, {0x193C, 0x193F}, {0x1941, 0x1943},
|
||||||
|
{0x196E, 0x196F}, {0x1975, 0x197F}, {0x19AC, 0x19AF},
|
||||||
|
{0x19CA, 0x19CF}, {0x19DB, 0x19DD}, {0x1A1C, 0x1A1D},
|
||||||
|
{0x1A5F, 0x1A5F}, {0x1A7D, 0x1A7E}, {0x1A8A, 0x1A8F},
|
||||||
|
{0x1A9A, 0x1A9F}, {0x1AAE, 0x1AAF}, {0x1ABF, 0x1AFF},
|
||||||
|
{0x1B4C, 0x1B4F}, {0x1B7D, 0x1B7F}, {0x1BF4, 0x1BFB},
|
||||||
|
{0x1C38, 0x1C3A}, {0x1C4A, 0x1C4C}, {0x1C89, 0x1CBF},
|
||||||
|
{0x1CC8, 0x1CCF}, {0x1CF7, 0x1CF7}, {0x1CFA, 0x1CFF},
|
||||||
|
{0x1DF6, 0x1DFA}, {0x1F16, 0x1F17}, {0x1F1E, 0x1F1F},
|
||||||
|
{0x1F46, 0x1F47}, {0x1F4E, 0x1F4F}, {0x1F58, 0x1F58},
|
||||||
|
{0x1F5A, 0x1F5A}, {0x1F5C, 0x1F5C}, {0x1F5E, 0x1F5E},
|
||||||
|
{0x1F7E, 0x1F7F}, {0x1FB5, 0x1FB5}, {0x1FC5, 0x1FC5},
|
||||||
|
{0x1FD4, 0x1FD5}, {0x1FDC, 0x1FDC}, {0x1FF0, 0x1FF1},
|
||||||
|
{0x1FF5, 0x1FF5}, {0x1FFF, 0x1FFF}, {0x2065, 0x2065},
|
||||||
|
{0x2072, 0x2073}, {0x208F, 0x208F}, {0x209D, 0x209F},
|
||||||
|
{0x20BF, 0x20CF}, {0x20F1, 0x20FF}, {0x218C, 0x218F},
|
||||||
|
{0x23FF, 0x23FF}, {0x2427, 0x243F}, {0x244B, 0x245F},
|
||||||
|
{0x2B74, 0x2B75}, {0x2B96, 0x2B97}, {0x2BBA, 0x2BBC},
|
||||||
|
{0x2BC9, 0x2BC9}, {0x2BD2, 0x2BEB}, {0x2BF0, 0x2BFF},
|
||||||
|
{0x2C2F, 0x2C2F}, {0x2C5F, 0x2C5F}, {0x2CF4, 0x2CF8},
|
||||||
|
{0x2D26, 0x2D26}, {0x2D28, 0x2D2C}, {0x2D2E, 0x2D2F},
|
||||||
|
{0x2D68, 0x2D6E}, {0x2D71, 0x2D7E}, {0x2D97, 0x2D9F},
|
||||||
|
{0x2DA7, 0x2DA7}, {0x2DAF, 0x2DAF}, {0x2DB7, 0x2DB7},
|
||||||
|
{0x2DBF, 0x2DBF}, {0x2DC7, 0x2DC7}, {0x2DCF, 0x2DCF},
|
||||||
|
{0x2DD7, 0x2DD7}, {0x2DDF, 0x2DDF}, {0x2E45, 0x2E7F},
|
||||||
|
{0x2E9A, 0x2E9A}, {0x2EF4, 0x2EFF}, {0x2FD6, 0x2FEF},
|
||||||
|
{0x2FFC, 0x2FFF}, {0x3040, 0x3040}, {0x3097, 0x3098},
|
||||||
|
{0x3100, 0x3104}, {0x312E, 0x3130}, {0x318F, 0x318F},
|
||||||
|
{0x31BB, 0x31BF}, {0x31E4, 0x31EF}, {0x321F, 0x321F},
|
||||||
|
{0x32FF, 0x32FF}, {0x4DB6, 0x4DBF}, {0x9FD6, 0x9FFF},
|
||||||
|
{0xA48D, 0xA48F}, {0xA4C7, 0xA4CF}, {0xA62C, 0xA63F},
|
||||||
|
{0xA6F8, 0xA6FF}, {0xA7AF, 0xA7AF}, {0xA7B8, 0xA7F6},
|
||||||
|
{0xA82C, 0xA82F}, {0xA83A, 0xA83F}, {0xA878, 0xA87F},
|
||||||
|
{0xA8C6, 0xA8CD}, {0xA8DA, 0xA8DF}, {0xA8FE, 0xA8FF},
|
||||||
|
{0xA954, 0xA95E}, {0xA97D, 0xA97F}, {0xA9CE, 0xA9CE},
|
||||||
|
{0xA9DA, 0xA9DD}, {0xA9FF, 0xA9FF}, {0xAA37, 0xAA3F},
|
||||||
|
{0xAA4E, 0xAA4F}, {0xAA5A, 0xAA5B}, {0xAAC3, 0xAADA},
|
||||||
|
{0xAAF7, 0xAB00}, {0xAB07, 0xAB08}, {0xAB0F, 0xAB10},
|
||||||
|
{0xAB17, 0xAB1F}, {0xAB27, 0xAB27}, {0xAB2F, 0xAB2F},
|
||||||
|
{0xAB66, 0xAB6F}, {0xABEE, 0xABEF}, {0xABFA, 0xABFF},
|
||||||
|
{0xD7A4, 0xD7AF}, {0xD7C7, 0xD7CA}, {0xD7FC, 0xD7FF},
|
||||||
|
{0xFA6E, 0xFA6F}, {0xFADA, 0xFAFF}, {0xFB07, 0xFB12},
|
||||||
|
{0xFB18, 0xFB1C}, {0xFB37, 0xFB37}, {0xFB3D, 0xFB3D},
|
||||||
|
{0xFB3F, 0xFB3F}, {0xFB42, 0xFB42}, {0xFB45, 0xFB45},
|
||||||
|
{0xFBC2, 0xFBD2}, {0xFD40, 0xFD4F}, {0xFD90, 0xFD91},
|
||||||
|
{0xFDC8, 0xFDEF}, {0xFDFE, 0xFDFF}, {0xFE1A, 0xFE1F},
|
||||||
|
{0xFE53, 0xFE53}, {0xFE67, 0xFE67}, {0xFE6C, 0xFE6F},
|
||||||
|
{0xFE75, 0xFE75}, {0xFEFD, 0xFEFE}, {0xFF00, 0xFF00},
|
||||||
|
{0xFFBF, 0xFFC1}, {0xFFC8, 0xFFC9}, {0xFFD0, 0xFFD1},
|
||||||
|
{0xFFD8, 0xFFD9}, {0xFFDD, 0xFFDF}, {0xFFE7, 0xFFE7},
|
||||||
|
{0xFFEF, 0xFFF8}, {0xFFFE, 0xFFFF}, {0x1000C, 0x1000C},
|
||||||
|
{0x10027, 0x10027}, {0x1003B, 0x1003B}, {0x1003E, 0x1003E},
|
||||||
|
{0x1004E, 0x1004F}, {0x1005E, 0x1007F}, {0x100FB, 0x100FF},
|
||||||
|
{0x10103, 0x10106}, {0x10134, 0x10136}, {0x1018F, 0x1018F},
|
||||||
|
{0x1019C, 0x1019F}, {0x101A1, 0x101CF}, {0x101FE, 0x1027F},
|
||||||
|
{0x1029D, 0x1029F}, {0x102D1, 0x102DF}, {0x102FC, 0x102FF},
|
||||||
|
{0x10324, 0x1032F}, {0x1034B, 0x1034F}, {0x1037B, 0x1037F},
|
||||||
|
{0x1039E, 0x1039E}, {0x103C4, 0x103C7}, {0x103D6, 0x103FF},
|
||||||
|
{0x1049E, 0x1049F}, {0x104AA, 0x104AF}, {0x104D4, 0x104D7},
|
||||||
|
{0x104FC, 0x104FF}, {0x10528, 0x1052F}, {0x10564, 0x1056E},
|
||||||
|
{0x10570, 0x105FF}, {0x10737, 0x1073F}, {0x10756, 0x1075F},
|
||||||
|
{0x10768, 0x107FF}, {0x10806, 0x10807}, {0x10809, 0x10809},
|
||||||
|
{0x10836, 0x10836}, {0x10839, 0x1083B}, {0x1083D, 0x1083E},
|
||||||
|
{0x10856, 0x10856}, {0x1089F, 0x108A6}, {0x108B0, 0x108DF},
|
||||||
|
{0x108F3, 0x108F3}, {0x108F6, 0x108FA}, {0x1091C, 0x1091E},
|
||||||
|
{0x1093A, 0x1093E}, {0x10940, 0x1097F}, {0x109B8, 0x109BB},
|
||||||
|
{0x109D0, 0x109D1}, {0x10A04, 0x10A04}, {0x10A07, 0x10A0B},
|
||||||
|
{0x10A14, 0x10A14}, {0x10A18, 0x10A18}, {0x10A34, 0x10A37},
|
||||||
|
{0x10A3B, 0x10A3E}, {0x10A48, 0x10A4F}, {0x10A59, 0x10A5F},
|
||||||
|
{0x10AA0, 0x10ABF}, {0x10AE7, 0x10AEA}, {0x10AF7, 0x10AFF},
|
||||||
|
{0x10B36, 0x10B38}, {0x10B56, 0x10B57}, {0x10B73, 0x10B77},
|
||||||
|
{0x10B92, 0x10B98}, {0x10B9D, 0x10BA8}, {0x10BB0, 0x10BFF},
|
||||||
|
{0x10C49, 0x10C7F}, {0x10CB3, 0x10CBF}, {0x10CF3, 0x10CF9},
|
||||||
|
{0x10D00, 0x10E5F}, {0x10E7F, 0x10FFF}, {0x1104E, 0x11051},
|
||||||
|
{0x11070, 0x1107E}, {0x110C2, 0x110CF}, {0x110E9, 0x110EF},
|
||||||
|
{0x110FA, 0x110FF}, {0x11135, 0x11135}, {0x11144, 0x1114F},
|
||||||
|
{0x11177, 0x1117F}, {0x111CE, 0x111CF}, {0x111E0, 0x111E0},
|
||||||
|
{0x111F5, 0x111FF}, {0x11212, 0x11212}, {0x1123F, 0x1127F},
|
||||||
|
{0x11287, 0x11287}, {0x11289, 0x11289}, {0x1128E, 0x1128E},
|
||||||
|
{0x1129E, 0x1129E}, {0x112AA, 0x112AF}, {0x112EB, 0x112EF},
|
||||||
|
{0x112FA, 0x112FF}, {0x11304, 0x11304}, {0x1130D, 0x1130E},
|
||||||
|
{0x11311, 0x11312}, {0x11329, 0x11329}, {0x11331, 0x11331},
|
||||||
|
{0x11334, 0x11334}, {0x1133A, 0x1133B}, {0x11345, 0x11346},
|
||||||
|
{0x11349, 0x1134A}, {0x1134E, 0x1134F}, {0x11351, 0x11356},
|
||||||
|
{0x11358, 0x1135C}, {0x11364, 0x11365}, {0x1136D, 0x1136F},
|
||||||
|
{0x11375, 0x113FF}, {0x1145A, 0x1145A}, {0x1145C, 0x1145C},
|
||||||
|
{0x1145E, 0x1147F}, {0x114C8, 0x114CF}, {0x114DA, 0x1157F},
|
||||||
|
{0x115B6, 0x115B7}, {0x115DE, 0x115FF}, {0x11645, 0x1164F},
|
||||||
|
{0x1165A, 0x1165F}, {0x1166D, 0x1167F}, {0x116B8, 0x116BF},
|
||||||
|
{0x116CA, 0x116FF}, {0x1171A, 0x1171C}, {0x1172C, 0x1172F},
|
||||||
|
{0x11740, 0x1189F}, {0x118F3, 0x118FE}, {0x11900, 0x11ABF},
|
||||||
|
{0x11AF9, 0x11BFF}, {0x11C09, 0x11C09}, {0x11C37, 0x11C37},
|
||||||
|
{0x11C46, 0x11C4F}, {0x11C6D, 0x11C6F}, {0x11C90, 0x11C91},
|
||||||
|
{0x11CA8, 0x11CA8}, {0x11CB7, 0x11FFF}, {0x1239A, 0x123FF},
|
||||||
|
{0x1246F, 0x1246F}, {0x12475, 0x1247F}, {0x12544, 0x12FFF},
|
||||||
|
{0x1342F, 0x143FF}, {0x14647, 0x167FF}, {0x16A39, 0x16A3F},
|
||||||
|
{0x16A5F, 0x16A5F}, {0x16A6A, 0x16A6D}, {0x16A70, 0x16ACF},
|
||||||
|
{0x16AEE, 0x16AEF}, {0x16AF6, 0x16AFF}, {0x16B46, 0x16B4F},
|
||||||
|
{0x16B5A, 0x16B5A}, {0x16B62, 0x16B62}, {0x16B78, 0x16B7C},
|
||||||
|
{0x16B90, 0x16EFF}, {0x16F45, 0x16F4F}, {0x16F7F, 0x16F8E},
|
||||||
|
{0x16FA0, 0x16FDF}, {0x16FE1, 0x16FFF}, {0x187ED, 0x187FF},
|
||||||
|
{0x18AF3, 0x1AFFF}, {0x1B002, 0x1BBFF}, {0x1BC6B, 0x1BC6F},
|
||||||
|
{0x1BC7D, 0x1BC7F}, {0x1BC89, 0x1BC8F}, {0x1BC9A, 0x1BC9B},
|
||||||
|
{0x1BCA4, 0x1CFFF}, {0x1D0F6, 0x1D0FF}, {0x1D127, 0x1D128},
|
||||||
|
{0x1D1E9, 0x1D1FF}, {0x1D246, 0x1D2FF}, {0x1D357, 0x1D35F},
|
||||||
|
{0x1D372, 0x1D3FF}, {0x1D455, 0x1D455}, {0x1D49D, 0x1D49D},
|
||||||
|
{0x1D4A0, 0x1D4A1}, {0x1D4A3, 0x1D4A4}, {0x1D4A7, 0x1D4A8},
|
||||||
|
{0x1D4AD, 0x1D4AD}, {0x1D4BA, 0x1D4BA}, {0x1D4BC, 0x1D4BC},
|
||||||
|
{0x1D4C4, 0x1D4C4}, {0x1D506, 0x1D506}, {0x1D50B, 0x1D50C},
|
||||||
|
{0x1D515, 0x1D515}, {0x1D51D, 0x1D51D}, {0x1D53A, 0x1D53A},
|
||||||
|
{0x1D53F, 0x1D53F}, {0x1D545, 0x1D545}, {0x1D547, 0x1D549},
|
||||||
|
{0x1D551, 0x1D551}, {0x1D6A6, 0x1D6A7}, {0x1D7CC, 0x1D7CD},
|
||||||
|
{0x1DA8C, 0x1DA9A}, {0x1DAA0, 0x1DAA0}, {0x1DAB0, 0x1DFFF},
|
||||||
|
{0x1E007, 0x1E007}, {0x1E019, 0x1E01A}, {0x1E022, 0x1E022},
|
||||||
|
{0x1E025, 0x1E025}, {0x1E02B, 0x1E7FF}, {0x1E8C5, 0x1E8C6},
|
||||||
|
{0x1E8D7, 0x1E8FF}, {0x1E94B, 0x1E94F}, {0x1E95A, 0x1E95D},
|
||||||
|
{0x1E960, 0x1EDFF}, {0x1EE04, 0x1EE04}, {0x1EE20, 0x1EE20},
|
||||||
|
{0x1EE23, 0x1EE23}, {0x1EE25, 0x1EE26}, {0x1EE28, 0x1EE28},
|
||||||
|
{0x1EE33, 0x1EE33}, {0x1EE38, 0x1EE38}, {0x1EE3A, 0x1EE3A},
|
||||||
|
{0x1EE3C, 0x1EE41}, {0x1EE43, 0x1EE46}, {0x1EE48, 0x1EE48},
|
||||||
|
{0x1EE4A, 0x1EE4A}, {0x1EE4C, 0x1EE4C}, {0x1EE50, 0x1EE50},
|
||||||
|
{0x1EE53, 0x1EE53}, {0x1EE55, 0x1EE56}, {0x1EE58, 0x1EE58},
|
||||||
|
{0x1EE5A, 0x1EE5A}, {0x1EE5C, 0x1EE5C}, {0x1EE5E, 0x1EE5E},
|
||||||
|
{0x1EE60, 0x1EE60}, {0x1EE63, 0x1EE63}, {0x1EE65, 0x1EE66},
|
||||||
|
{0x1EE6B, 0x1EE6B}, {0x1EE73, 0x1EE73}, {0x1EE78, 0x1EE78},
|
||||||
|
{0x1EE7D, 0x1EE7D}, {0x1EE7F, 0x1EE7F}, {0x1EE8A, 0x1EE8A},
|
||||||
|
{0x1EE9C, 0x1EEA0}, {0x1EEA4, 0x1EEA4}, {0x1EEAA, 0x1EEAA},
|
||||||
|
{0x1EEBC, 0x1EEEF}, {0x1EEF2, 0x1EFFF}, {0x1F02C, 0x1F02F},
|
||||||
|
{0x1F094, 0x1F09F}, {0x1F0AF, 0x1F0B0}, {0x1F0C0, 0x1F0C0},
|
||||||
|
{0x1F0D0, 0x1F0D0}, {0x1F0F6, 0x1F0FF}, {0x1F10D, 0x1F10F},
|
||||||
|
{0x1F12F, 0x1F12F}, {0x1F16C, 0x1F16F}, {0x1F1AD, 0x1F1E5},
|
||||||
|
{0x1F203, 0x1F20F}, {0x1F23C, 0x1F23F}, {0x1F249, 0x1F24F},
|
||||||
|
{0x1F252, 0x1F2FF}, {0x1F6D3, 0x1F6DF}, {0x1F6ED, 0x1F6EF},
|
||||||
|
{0x1F6F7, 0x1F6FF}, {0x1F774, 0x1F77F}, {0x1F7D5, 0x1F7FF},
|
||||||
|
{0x1F80C, 0x1F80F}, {0x1F848, 0x1F84F}, {0x1F85A, 0x1F85F},
|
||||||
|
{0x1F888, 0x1F88F}, {0x1F8AE, 0x1F90F}, {0x1F91F, 0x1F91F},
|
||||||
|
{0x1F928, 0x1F92F}, {0x1F931, 0x1F932}, {0x1F93F, 0x1F93F},
|
||||||
|
{0x1F94C, 0x1F94F}, {0x1F95F, 0x1F97F}, {0x1F992, 0x1F9BF},
|
||||||
|
{0x1F9C1, 0x1FFFF}, {0x2A6D7, 0x2A6FF}, {0x2B735, 0x2B73F},
|
||||||
|
{0x2B81E, 0x2B81F}, {0x2CEA2, 0x2F7FF}, {0x2FA1E, 0xE0000},
|
||||||
|
{0xE0002, 0xE001F}, {0xE0080, 0xE00FF}, {0xE01F0, 0xEFFFF},
|
||||||
|
{0xFFFFE, 0xFFFFF},
|
||||||
|
}
|
||||||
|
|
||||||
|
var neutral = table{
|
||||||
|
{0x0000, 0x001F}, {0x007F, 0x00A0}, {0x00A9, 0x00A9},
|
||||||
|
{0x00AB, 0x00AB}, {0x00B5, 0x00B5}, {0x00BB, 0x00BB},
|
||||||
|
{0x00C0, 0x00C5}, {0x00C7, 0x00CF}, {0x00D1, 0x00D6},
|
||||||
|
{0x00D9, 0x00DD}, {0x00E2, 0x00E5}, {0x00E7, 0x00E7},
|
||||||
|
{0x00EB, 0x00EB}, {0x00EE, 0x00EF}, {0x00F1, 0x00F1},
|
||||||
|
{0x00F4, 0x00F6}, {0x00FB, 0x00FB}, {0x00FD, 0x00FD},
|
||||||
|
{0x00FF, 0x0100}, {0x0102, 0x0110}, {0x0112, 0x0112},
|
||||||
|
{0x0114, 0x011A}, {0x011C, 0x0125}, {0x0128, 0x012A},
|
||||||
|
{0x012C, 0x0130}, {0x0134, 0x0137}, {0x0139, 0x013E},
|
||||||
|
{0x0143, 0x0143}, {0x0145, 0x0147}, {0x014C, 0x014C},
|
||||||
|
{0x014E, 0x0151}, {0x0154, 0x0165}, {0x0168, 0x016A},
|
||||||
|
{0x016C, 0x01CD}, {0x01CF, 0x01CF}, {0x01D1, 0x01D1},
|
||||||
|
{0x01D3, 0x01D3}, {0x01D5, 0x01D5}, {0x01D7, 0x01D7},
|
||||||
|
{0x01D9, 0x01D9}, {0x01DB, 0x01DB}, {0x01DD, 0x0250},
|
||||||
|
{0x0252, 0x0260}, {0x0262, 0x02C3}, {0x02C5, 0x02C6},
|
||||||
|
{0x02C8, 0x02C8}, {0x02CC, 0x02CC}, {0x02CE, 0x02CF},
|
||||||
|
{0x02D1, 0x02D7}, {0x02DC, 0x02DC}, {0x02DE, 0x02DE},
|
||||||
|
{0x02E0, 0x02FF}, {0x0370, 0x0377}, {0x037A, 0x037F},
|
||||||
|
{0x0384, 0x038A}, {0x038C, 0x038C}, {0x038E, 0x0390},
|
||||||
|
{0x03AA, 0x03B0}, {0x03C2, 0x03C2}, {0x03CA, 0x0400},
|
||||||
|
{0x0402, 0x040F}, {0x0450, 0x0450}, {0x0452, 0x052F},
|
||||||
|
{0x0531, 0x0556}, {0x0559, 0x055F}, {0x0561, 0x0587},
|
||||||
|
{0x0589, 0x058A}, {0x058D, 0x058F}, {0x0591, 0x05C7},
|
||||||
|
{0x05D0, 0x05EA}, {0x05F0, 0x05F4}, {0x0600, 0x061C},
|
||||||
|
{0x061E, 0x070D}, {0x070F, 0x074A}, {0x074D, 0x07B1},
|
||||||
|
{0x07C0, 0x07FA}, {0x0800, 0x082D}, {0x0830, 0x083E},
|
||||||
|
{0x0840, 0x085B}, {0x085E, 0x085E}, {0x08A0, 0x08B4},
|
||||||
|
{0x08B6, 0x08BD}, {0x08D4, 0x0983}, {0x0985, 0x098C},
|
||||||
|
{0x098F, 0x0990}, {0x0993, 0x09A8}, {0x09AA, 0x09B0},
|
||||||
|
{0x09B2, 0x09B2}, {0x09B6, 0x09B9}, {0x09BC, 0x09C4},
|
||||||
|
{0x09C7, 0x09C8}, {0x09CB, 0x09CE}, {0x09D7, 0x09D7},
|
||||||
|
{0x09DC, 0x09DD}, {0x09DF, 0x09E3}, {0x09E6, 0x09FB},
|
||||||
|
{0x0A01, 0x0A03}, {0x0A05, 0x0A0A}, {0x0A0F, 0x0A10},
|
||||||
|
{0x0A13, 0x0A28}, {0x0A2A, 0x0A30}, {0x0A32, 0x0A33},
|
||||||
|
{0x0A35, 0x0A36}, {0x0A38, 0x0A39}, {0x0A3C, 0x0A3C},
|
||||||
|
{0x0A3E, 0x0A42}, {0x0A47, 0x0A48}, {0x0A4B, 0x0A4D},
|
||||||
|
{0x0A51, 0x0A51}, {0x0A59, 0x0A5C}, {0x0A5E, 0x0A5E},
|
||||||
|
{0x0A66, 0x0A75}, {0x0A81, 0x0A83}, {0x0A85, 0x0A8D},
|
||||||
|
{0x0A8F, 0x0A91}, {0x0A93, 0x0AA8}, {0x0AAA, 0x0AB0},
|
||||||
|
{0x0AB2, 0x0AB3}, {0x0AB5, 0x0AB9}, {0x0ABC, 0x0AC5},
|
||||||
|
{0x0AC7, 0x0AC9}, {0x0ACB, 0x0ACD}, {0x0AD0, 0x0AD0},
|
||||||
|
{0x0AE0, 0x0AE3}, {0x0AE6, 0x0AF1}, {0x0AF9, 0x0AF9},
|
||||||
|
{0x0B01, 0x0B03}, {0x0B05, 0x0B0C}, {0x0B0F, 0x0B10},
|
||||||
|
{0x0B13, 0x0B28}, {0x0B2A, 0x0B30}, {0x0B32, 0x0B33},
|
||||||
|
{0x0B35, 0x0B39}, {0x0B3C, 0x0B44}, {0x0B47, 0x0B48},
|
||||||
|
{0x0B4B, 0x0B4D}, {0x0B56, 0x0B57}, {0x0B5C, 0x0B5D},
|
||||||
|
{0x0B5F, 0x0B63}, {0x0B66, 0x0B77}, {0x0B82, 0x0B83},
|
||||||
|
{0x0B85, 0x0B8A}, {0x0B8E, 0x0B90}, {0x0B92, 0x0B95},
|
||||||
|
{0x0B99, 0x0B9A}, {0x0B9C, 0x0B9C}, {0x0B9E, 0x0B9F},
|
||||||
|
{0x0BA3, 0x0BA4}, {0x0BA8, 0x0BAA}, {0x0BAE, 0x0BB9},
|
||||||
|
{0x0BBE, 0x0BC2}, {0x0BC6, 0x0BC8}, {0x0BCA, 0x0BCD},
|
||||||
|
{0x0BD0, 0x0BD0}, {0x0BD7, 0x0BD7}, {0x0BE6, 0x0BFA},
|
||||||
|
{0x0C00, 0x0C03}, {0x0C05, 0x0C0C}, {0x0C0E, 0x0C10},
|
||||||
|
{0x0C12, 0x0C28}, {0x0C2A, 0x0C39}, {0x0C3D, 0x0C44},
|
||||||
|
{0x0C46, 0x0C48}, {0x0C4A, 0x0C4D}, {0x0C55, 0x0C56},
|
||||||
|
{0x0C58, 0x0C5A}, {0x0C60, 0x0C63}, {0x0C66, 0x0C6F},
|
||||||
|
{0x0C78, 0x0C83}, {0x0C85, 0x0C8C}, {0x0C8E, 0x0C90},
|
||||||
|
{0x0C92, 0x0CA8}, {0x0CAA, 0x0CB3}, {0x0CB5, 0x0CB9},
|
||||||
|
{0x0CBC, 0x0CC4}, {0x0CC6, 0x0CC8}, {0x0CCA, 0x0CCD},
|
||||||
|
{0x0CD5, 0x0CD6}, {0x0CDE, 0x0CDE}, {0x0CE0, 0x0CE3},
|
||||||
|
{0x0CE6, 0x0CEF}, {0x0CF1, 0x0CF2}, {0x0D01, 0x0D03},
|
||||||
|
{0x0D05, 0x0D0C}, {0x0D0E, 0x0D10}, {0x0D12, 0x0D3A},
|
||||||
|
{0x0D3D, 0x0D44}, {0x0D46, 0x0D48}, {0x0D4A, 0x0D4F},
|
||||||
|
{0x0D54, 0x0D63}, {0x0D66, 0x0D7F}, {0x0D82, 0x0D83},
|
||||||
|
{0x0D85, 0x0D96}, {0x0D9A, 0x0DB1}, {0x0DB3, 0x0DBB},
|
||||||
|
{0x0DBD, 0x0DBD}, {0x0DC0, 0x0DC6}, {0x0DCA, 0x0DCA},
|
||||||
|
{0x0DCF, 0x0DD4}, {0x0DD6, 0x0DD6}, {0x0DD8, 0x0DDF},
|
||||||
|
{0x0DE6, 0x0DEF}, {0x0DF2, 0x0DF4}, {0x0E01, 0x0E3A},
|
||||||
|
{0x0E3F, 0x0E5B}, {0x0E81, 0x0E82}, {0x0E84, 0x0E84},
|
||||||
|
{0x0E87, 0x0E88}, {0x0E8A, 0x0E8A}, {0x0E8D, 0x0E8D},
|
||||||
|
{0x0E94, 0x0E97}, {0x0E99, 0x0E9F}, {0x0EA1, 0x0EA3},
|
||||||
|
{0x0EA5, 0x0EA5}, {0x0EA7, 0x0EA7}, {0x0EAA, 0x0EAB},
|
||||||
|
{0x0EAD, 0x0EB9}, {0x0EBB, 0x0EBD}, {0x0EC0, 0x0EC4},
|
||||||
|
{0x0EC6, 0x0EC6}, {0x0EC8, 0x0ECD}, {0x0ED0, 0x0ED9},
|
||||||
|
{0x0EDC, 0x0EDF}, {0x0F00, 0x0F47}, {0x0F49, 0x0F6C},
|
||||||
|
{0x0F71, 0x0F97}, {0x0F99, 0x0FBC}, {0x0FBE, 0x0FCC},
|
||||||
|
{0x0FCE, 0x0FDA}, {0x1000, 0x10C5}, {0x10C7, 0x10C7},
|
||||||
|
{0x10CD, 0x10CD}, {0x10D0, 0x10FF}, {0x1160, 0x1248},
|
||||||
|
{0x124A, 0x124D}, {0x1250, 0x1256}, {0x1258, 0x1258},
|
||||||
|
{0x125A, 0x125D}, {0x1260, 0x1288}, {0x128A, 0x128D},
|
||||||
|
{0x1290, 0x12B0}, {0x12B2, 0x12B5}, {0x12B8, 0x12BE},
|
||||||
|
{0x12C0, 0x12C0}, {0x12C2, 0x12C5}, {0x12C8, 0x12D6},
|
||||||
|
{0x12D8, 0x1310}, {0x1312, 0x1315}, {0x1318, 0x135A},
|
||||||
|
{0x135D, 0x137C}, {0x1380, 0x1399}, {0x13A0, 0x13F5},
|
||||||
|
{0x13F8, 0x13FD}, {0x1400, 0x169C}, {0x16A0, 0x16F8},
|
||||||
|
{0x1700, 0x170C}, {0x170E, 0x1714}, {0x1720, 0x1736},
|
||||||
|
{0x1740, 0x1753}, {0x1760, 0x176C}, {0x176E, 0x1770},
|
||||||
|
{0x1772, 0x1773}, {0x1780, 0x17DD}, {0x17E0, 0x17E9},
|
||||||
|
{0x17F0, 0x17F9}, {0x1800, 0x180E}, {0x1810, 0x1819},
|
||||||
|
{0x1820, 0x1877}, {0x1880, 0x18AA}, {0x18B0, 0x18F5},
|
||||||
|
{0x1900, 0x191E}, {0x1920, 0x192B}, {0x1930, 0x193B},
|
||||||
|
{0x1940, 0x1940}, {0x1944, 0x196D}, {0x1970, 0x1974},
|
||||||
|
{0x1980, 0x19AB}, {0x19B0, 0x19C9}, {0x19D0, 0x19DA},
|
||||||
|
{0x19DE, 0x1A1B}, {0x1A1E, 0x1A5E}, {0x1A60, 0x1A7C},
|
||||||
|
{0x1A7F, 0x1A89}, {0x1A90, 0x1A99}, {0x1AA0, 0x1AAD},
|
||||||
|
{0x1AB0, 0x1ABE}, {0x1B00, 0x1B4B}, {0x1B50, 0x1B7C},
|
||||||
|
{0x1B80, 0x1BF3}, {0x1BFC, 0x1C37}, {0x1C3B, 0x1C49},
|
||||||
|
{0x1C4D, 0x1C88}, {0x1CC0, 0x1CC7}, {0x1CD0, 0x1CF6},
|
||||||
|
{0x1CF8, 0x1CF9}, {0x1D00, 0x1DF5}, {0x1DFB, 0x1F15},
|
||||||
|
{0x1F18, 0x1F1D}, {0x1F20, 0x1F45}, {0x1F48, 0x1F4D},
|
||||||
|
{0x1F50, 0x1F57}, {0x1F59, 0x1F59}, {0x1F5B, 0x1F5B},
|
||||||
|
{0x1F5D, 0x1F5D}, {0x1F5F, 0x1F7D}, {0x1F80, 0x1FB4},
|
||||||
|
{0x1FB6, 0x1FC4}, {0x1FC6, 0x1FD3}, {0x1FD6, 0x1FDB},
|
||||||
|
{0x1FDD, 0x1FEF}, {0x1FF2, 0x1FF4}, {0x1FF6, 0x1FFE},
|
||||||
|
{0x2000, 0x200F}, {0x2011, 0x2012}, {0x2017, 0x2017},
|
||||||
|
{0x201A, 0x201B}, {0x201E, 0x201F}, {0x2023, 0x2023},
|
||||||
|
{0x2028, 0x202F}, {0x2031, 0x2031}, {0x2034, 0x2034},
|
||||||
|
{0x2036, 0x203A}, {0x203C, 0x203D}, {0x203F, 0x2064},
|
||||||
|
{0x2066, 0x2071}, {0x2075, 0x207E}, {0x2080, 0x2080},
|
||||||
|
{0x2085, 0x208E}, {0x2090, 0x209C}, {0x20A0, 0x20A8},
|
||||||
|
{0x20AA, 0x20AB}, {0x20AD, 0x20BE}, {0x20D0, 0x20F0},
|
||||||
|
{0x2100, 0x2102}, {0x2104, 0x2104}, {0x2106, 0x2108},
|
||||||
|
{0x210A, 0x2112}, {0x2114, 0x2115}, {0x2117, 0x2120},
|
||||||
|
{0x2123, 0x2125}, {0x2127, 0x212A}, {0x212C, 0x2152},
|
||||||
|
{0x2155, 0x215A}, {0x215F, 0x215F}, {0x216C, 0x216F},
|
||||||
|
{0x217A, 0x2188}, {0x218A, 0x218B}, {0x219A, 0x21B7},
|
||||||
|
{0x21BA, 0x21D1}, {0x21D3, 0x21D3}, {0x21D5, 0x21E6},
|
||||||
|
{0x21E8, 0x21FF}, {0x2201, 0x2201}, {0x2204, 0x2206},
|
||||||
|
{0x2209, 0x220A}, {0x220C, 0x220E}, {0x2210, 0x2210},
|
||||||
|
{0x2212, 0x2214}, {0x2216, 0x2219}, {0x221B, 0x221C},
|
||||||
|
{0x2221, 0x2222}, {0x2224, 0x2224}, {0x2226, 0x2226},
|
||||||
|
{0x222D, 0x222D}, {0x222F, 0x2233}, {0x2238, 0x223B},
|
||||||
|
{0x223E, 0x2247}, {0x2249, 0x224B}, {0x224D, 0x2251},
|
||||||
|
{0x2253, 0x225F}, {0x2262, 0x2263}, {0x2268, 0x2269},
|
||||||
|
{0x226C, 0x226D}, {0x2270, 0x2281}, {0x2284, 0x2285},
|
||||||
|
{0x2288, 0x2294}, {0x2296, 0x2298}, {0x229A, 0x22A4},
|
||||||
|
{0x22A6, 0x22BE}, {0x22C0, 0x2311}, {0x2313, 0x2319},
|
||||||
|
{0x231C, 0x2328}, {0x232B, 0x23E8}, {0x23ED, 0x23EF},
|
||||||
|
{0x23F1, 0x23F2}, {0x23F4, 0x23FE}, {0x2400, 0x2426},
|
||||||
|
{0x2440, 0x244A}, {0x24EA, 0x24EA}, {0x254C, 0x254F},
|
||||||
|
{0x2574, 0x257F}, {0x2590, 0x2591}, {0x2596, 0x259F},
|
||||||
|
{0x25A2, 0x25A2}, {0x25AA, 0x25B1}, {0x25B4, 0x25B5},
|
||||||
|
{0x25B8, 0x25BB}, {0x25BE, 0x25BF}, {0x25C2, 0x25C5},
|
||||||
|
{0x25C9, 0x25CA}, {0x25CC, 0x25CD}, {0x25D2, 0x25E1},
|
||||||
|
{0x25E6, 0x25EE}, {0x25F0, 0x25FC}, {0x25FF, 0x2604},
|
||||||
|
{0x2607, 0x2608}, {0x260A, 0x260D}, {0x2610, 0x2613},
|
||||||
|
{0x2616, 0x261B}, {0x261D, 0x261D}, {0x261F, 0x263F},
|
||||||
|
{0x2641, 0x2641}, {0x2643, 0x2647}, {0x2654, 0x265F},
|
||||||
|
{0x2662, 0x2662}, {0x2666, 0x2666}, {0x266B, 0x266B},
|
||||||
|
{0x266E, 0x266E}, {0x2670, 0x267E}, {0x2680, 0x2692},
|
||||||
|
{0x2694, 0x269D}, {0x26A0, 0x26A0}, {0x26A2, 0x26A9},
|
||||||
|
{0x26AC, 0x26BC}, {0x26C0, 0x26C3}, {0x26E2, 0x26E2},
|
||||||
|
{0x26E4, 0x26E7}, {0x2700, 0x2704}, {0x2706, 0x2709},
|
||||||
|
{0x270C, 0x2727}, {0x2729, 0x273C}, {0x273E, 0x274B},
|
||||||
|
{0x274D, 0x274D}, {0x274F, 0x2752}, {0x2756, 0x2756},
|
||||||
|
{0x2758, 0x2775}, {0x2780, 0x2794}, {0x2798, 0x27AF},
|
||||||
|
{0x27B1, 0x27BE}, {0x27C0, 0x27E5}, {0x27EE, 0x2984},
|
||||||
|
{0x2987, 0x2B1A}, {0x2B1D, 0x2B4F}, {0x2B51, 0x2B54},
|
||||||
|
{0x2B5A, 0x2B73}, {0x2B76, 0x2B95}, {0x2B98, 0x2BB9},
|
||||||
|
{0x2BBD, 0x2BC8}, {0x2BCA, 0x2BD1}, {0x2BEC, 0x2BEF},
|
||||||
|
{0x2C00, 0x2C2E}, {0x2C30, 0x2C5E}, {0x2C60, 0x2CF3},
|
||||||
|
{0x2CF9, 0x2D25}, {0x2D27, 0x2D27}, {0x2D2D, 0x2D2D},
|
||||||
|
{0x2D30, 0x2D67}, {0x2D6F, 0x2D70}, {0x2D7F, 0x2D96},
|
||||||
|
{0x2DA0, 0x2DA6}, {0x2DA8, 0x2DAE}, {0x2DB0, 0x2DB6},
|
||||||
|
{0x2DB8, 0x2DBE}, {0x2DC0, 0x2DC6}, {0x2DC8, 0x2DCE},
|
||||||
|
{0x2DD0, 0x2DD6}, {0x2DD8, 0x2DDE}, {0x2DE0, 0x2E44},
|
||||||
|
{0x303F, 0x303F}, {0x4DC0, 0x4DFF}, {0xA4D0, 0xA62B},
|
||||||
|
{0xA640, 0xA6F7}, {0xA700, 0xA7AE}, {0xA7B0, 0xA7B7},
|
||||||
|
{0xA7F7, 0xA82B}, {0xA830, 0xA839}, {0xA840, 0xA877},
|
||||||
|
{0xA880, 0xA8C5}, {0xA8CE, 0xA8D9}, {0xA8E0, 0xA8FD},
|
||||||
|
{0xA900, 0xA953}, {0xA95F, 0xA95F}, {0xA980, 0xA9CD},
|
||||||
|
{0xA9CF, 0xA9D9}, {0xA9DE, 0xA9FE}, {0xAA00, 0xAA36},
|
||||||
|
{0xAA40, 0xAA4D}, {0xAA50, 0xAA59}, {0xAA5C, 0xAAC2},
|
||||||
|
{0xAADB, 0xAAF6}, {0xAB01, 0xAB06}, {0xAB09, 0xAB0E},
|
||||||
|
{0xAB11, 0xAB16}, {0xAB20, 0xAB26}, {0xAB28, 0xAB2E},
|
||||||
|
{0xAB30, 0xAB65}, {0xAB70, 0xABED}, {0xABF0, 0xABF9},
|
||||||
|
{0xD7B0, 0xD7C6}, {0xD7CB, 0xD7FB}, {0xD800, 0xDFFF},
|
||||||
|
{0xFB00, 0xFB06}, {0xFB13, 0xFB17}, {0xFB1D, 0xFB36},
|
||||||
|
{0xFB38, 0xFB3C}, {0xFB3E, 0xFB3E}, {0xFB40, 0xFB41},
|
||||||
|
{0xFB43, 0xFB44}, {0xFB46, 0xFBC1}, {0xFBD3, 0xFD3F},
|
||||||
|
{0xFD50, 0xFD8F}, {0xFD92, 0xFDC7}, {0xFDF0, 0xFDFD},
|
||||||
|
{0xFE20, 0xFE2F}, {0xFE70, 0xFE74}, {0xFE76, 0xFEFC},
|
||||||
|
{0xFEFF, 0xFEFF}, {0xFFF9, 0xFFFC}, {0x10000, 0x1000B},
|
||||||
|
{0x1000D, 0x10026}, {0x10028, 0x1003A}, {0x1003C, 0x1003D},
|
||||||
|
{0x1003F, 0x1004D}, {0x10050, 0x1005D}, {0x10080, 0x100FA},
|
||||||
|
{0x10100, 0x10102}, {0x10107, 0x10133}, {0x10137, 0x1018E},
|
||||||
|
{0x10190, 0x1019B}, {0x101A0, 0x101A0}, {0x101D0, 0x101FD},
|
||||||
|
{0x10280, 0x1029C}, {0x102A0, 0x102D0}, {0x102E0, 0x102FB},
|
||||||
|
{0x10300, 0x10323}, {0x10330, 0x1034A}, {0x10350, 0x1037A},
|
||||||
|
{0x10380, 0x1039D}, {0x1039F, 0x103C3}, {0x103C8, 0x103D5},
|
||||||
|
{0x10400, 0x1049D}, {0x104A0, 0x104A9}, {0x104B0, 0x104D3},
|
||||||
|
{0x104D8, 0x104FB}, {0x10500, 0x10527}, {0x10530, 0x10563},
|
||||||
|
{0x1056F, 0x1056F}, {0x10600, 0x10736}, {0x10740, 0x10755},
|
||||||
|
{0x10760, 0x10767}, {0x10800, 0x10805}, {0x10808, 0x10808},
|
||||||
|
{0x1080A, 0x10835}, {0x10837, 0x10838}, {0x1083C, 0x1083C},
|
||||||
|
{0x1083F, 0x10855}, {0x10857, 0x1089E}, {0x108A7, 0x108AF},
|
||||||
|
{0x108E0, 0x108F2}, {0x108F4, 0x108F5}, {0x108FB, 0x1091B},
|
||||||
|
{0x1091F, 0x10939}, {0x1093F, 0x1093F}, {0x10980, 0x109B7},
|
||||||
|
{0x109BC, 0x109CF}, {0x109D2, 0x10A03}, {0x10A05, 0x10A06},
|
||||||
|
{0x10A0C, 0x10A13}, {0x10A15, 0x10A17}, {0x10A19, 0x10A33},
|
||||||
|
{0x10A38, 0x10A3A}, {0x10A3F, 0x10A47}, {0x10A50, 0x10A58},
|
||||||
|
{0x10A60, 0x10A9F}, {0x10AC0, 0x10AE6}, {0x10AEB, 0x10AF6},
|
||||||
|
{0x10B00, 0x10B35}, {0x10B39, 0x10B55}, {0x10B58, 0x10B72},
|
||||||
|
{0x10B78, 0x10B91}, {0x10B99, 0x10B9C}, {0x10BA9, 0x10BAF},
|
||||||
|
{0x10C00, 0x10C48}, {0x10C80, 0x10CB2}, {0x10CC0, 0x10CF2},
|
||||||
|
{0x10CFA, 0x10CFF}, {0x10E60, 0x10E7E}, {0x11000, 0x1104D},
|
||||||
|
{0x11052, 0x1106F}, {0x1107F, 0x110C1}, {0x110D0, 0x110E8},
|
||||||
|
{0x110F0, 0x110F9}, {0x11100, 0x11134}, {0x11136, 0x11143},
|
||||||
|
{0x11150, 0x11176}, {0x11180, 0x111CD}, {0x111D0, 0x111DF},
|
||||||
|
{0x111E1, 0x111F4}, {0x11200, 0x11211}, {0x11213, 0x1123E},
|
||||||
|
{0x11280, 0x11286}, {0x11288, 0x11288}, {0x1128A, 0x1128D},
|
||||||
|
{0x1128F, 0x1129D}, {0x1129F, 0x112A9}, {0x112B0, 0x112EA},
|
||||||
|
{0x112F0, 0x112F9}, {0x11300, 0x11303}, {0x11305, 0x1130C},
|
||||||
|
{0x1130F, 0x11310}, {0x11313, 0x11328}, {0x1132A, 0x11330},
|
||||||
|
{0x11332, 0x11333}, {0x11335, 0x11339}, {0x1133C, 0x11344},
|
||||||
|
{0x11347, 0x11348}, {0x1134B, 0x1134D}, {0x11350, 0x11350},
|
||||||
|
{0x11357, 0x11357}, {0x1135D, 0x11363}, {0x11366, 0x1136C},
|
||||||
|
{0x11370, 0x11374}, {0x11400, 0x11459}, {0x1145B, 0x1145B},
|
||||||
|
{0x1145D, 0x1145D}, {0x11480, 0x114C7}, {0x114D0, 0x114D9},
|
||||||
|
{0x11580, 0x115B5}, {0x115B8, 0x115DD}, {0x11600, 0x11644},
|
||||||
|
{0x11650, 0x11659}, {0x11660, 0x1166C}, {0x11680, 0x116B7},
|
||||||
|
{0x116C0, 0x116C9}, {0x11700, 0x11719}, {0x1171D, 0x1172B},
|
||||||
|
{0x11730, 0x1173F}, {0x118A0, 0x118F2}, {0x118FF, 0x118FF},
|
||||||
|
{0x11AC0, 0x11AF8}, {0x11C00, 0x11C08}, {0x11C0A, 0x11C36},
|
||||||
|
{0x11C38, 0x11C45}, {0x11C50, 0x11C6C}, {0x11C70, 0x11C8F},
|
||||||
|
{0x11C92, 0x11CA7}, {0x11CA9, 0x11CB6}, {0x12000, 0x12399},
|
||||||
|
{0x12400, 0x1246E}, {0x12470, 0x12474}, {0x12480, 0x12543},
|
||||||
|
{0x13000, 0x1342E}, {0x14400, 0x14646}, {0x16800, 0x16A38},
|
||||||
|
{0x16A40, 0x16A5E}, {0x16A60, 0x16A69}, {0x16A6E, 0x16A6F},
|
||||||
|
{0x16AD0, 0x16AED}, {0x16AF0, 0x16AF5}, {0x16B00, 0x16B45},
|
||||||
|
{0x16B50, 0x16B59}, {0x16B5B, 0x16B61}, {0x16B63, 0x16B77},
|
||||||
|
{0x16B7D, 0x16B8F}, {0x16F00, 0x16F44}, {0x16F50, 0x16F7E},
|
||||||
|
{0x16F8F, 0x16F9F}, {0x1BC00, 0x1BC6A}, {0x1BC70, 0x1BC7C},
|
||||||
|
{0x1BC80, 0x1BC88}, {0x1BC90, 0x1BC99}, {0x1BC9C, 0x1BCA3},
|
||||||
|
{0x1D000, 0x1D0F5}, {0x1D100, 0x1D126}, {0x1D129, 0x1D1E8},
|
||||||
|
{0x1D200, 0x1D245}, {0x1D300, 0x1D356}, {0x1D360, 0x1D371},
|
||||||
|
{0x1D400, 0x1D454}, {0x1D456, 0x1D49C}, {0x1D49E, 0x1D49F},
|
||||||
|
{0x1D4A2, 0x1D4A2}, {0x1D4A5, 0x1D4A6}, {0x1D4A9, 0x1D4AC},
|
||||||
|
{0x1D4AE, 0x1D4B9}, {0x1D4BB, 0x1D4BB}, {0x1D4BD, 0x1D4C3},
|
||||||
|
{0x1D4C5, 0x1D505}, {0x1D507, 0x1D50A}, {0x1D50D, 0x1D514},
|
||||||
|
{0x1D516, 0x1D51C}, {0x1D51E, 0x1D539}, {0x1D53B, 0x1D53E},
|
||||||
|
{0x1D540, 0x1D544}, {0x1D546, 0x1D546}, {0x1D54A, 0x1D550},
|
||||||
|
{0x1D552, 0x1D6A5}, {0x1D6A8, 0x1D7CB}, {0x1D7CE, 0x1DA8B},
|
||||||
|
{0x1DA9B, 0x1DA9F}, {0x1DAA1, 0x1DAAF}, {0x1E000, 0x1E006},
|
||||||
|
{0x1E008, 0x1E018}, {0x1E01B, 0x1E021}, {0x1E023, 0x1E024},
|
||||||
|
{0x1E026, 0x1E02A}, {0x1E800, 0x1E8C4}, {0x1E8C7, 0x1E8D6},
|
||||||
|
{0x1E900, 0x1E94A}, {0x1E950, 0x1E959}, {0x1E95E, 0x1E95F},
|
||||||
|
{0x1EE00, 0x1EE03}, {0x1EE05, 0x1EE1F}, {0x1EE21, 0x1EE22},
|
||||||
|
{0x1EE24, 0x1EE24}, {0x1EE27, 0x1EE27}, {0x1EE29, 0x1EE32},
|
||||||
|
{0x1EE34, 0x1EE37}, {0x1EE39, 0x1EE39}, {0x1EE3B, 0x1EE3B},
|
||||||
|
{0x1EE42, 0x1EE42}, {0x1EE47, 0x1EE47}, {0x1EE49, 0x1EE49},
|
||||||
|
{0x1EE4B, 0x1EE4B}, {0x1EE4D, 0x1EE4F}, {0x1EE51, 0x1EE52},
|
||||||
|
{0x1EE54, 0x1EE54}, {0x1EE57, 0x1EE57}, {0x1EE59, 0x1EE59},
|
||||||
|
{0x1EE5B, 0x1EE5B}, {0x1EE5D, 0x1EE5D}, {0x1EE5F, 0x1EE5F},
|
||||||
|
{0x1EE61, 0x1EE62}, {0x1EE64, 0x1EE64}, {0x1EE67, 0x1EE6A},
|
||||||
|
{0x1EE6C, 0x1EE72}, {0x1EE74, 0x1EE77}, {0x1EE79, 0x1EE7C},
|
||||||
|
{0x1EE7E, 0x1EE7E}, {0x1EE80, 0x1EE89}, {0x1EE8B, 0x1EE9B},
|
||||||
|
{0x1EEA1, 0x1EEA3}, {0x1EEA5, 0x1EEA9}, {0x1EEAB, 0x1EEBB},
|
||||||
|
{0x1EEF0, 0x1EEF1}, {0x1F000, 0x1F003}, {0x1F005, 0x1F02B},
|
||||||
|
{0x1F030, 0x1F093}, {0x1F0A0, 0x1F0AE}, {0x1F0B1, 0x1F0BF},
|
||||||
|
{0x1F0C1, 0x1F0CE}, {0x1F0D1, 0x1F0F5}, {0x1F10B, 0x1F10C},
|
||||||
|
{0x1F12E, 0x1F12E}, {0x1F16A, 0x1F16B}, {0x1F1E6, 0x1F1FF},
|
||||||
|
{0x1F321, 0x1F32C}, {0x1F336, 0x1F336}, {0x1F37D, 0x1F37D},
|
||||||
|
{0x1F394, 0x1F39F}, {0x1F3CB, 0x1F3CE}, {0x1F3D4, 0x1F3DF},
|
||||||
|
{0x1F3F1, 0x1F3F3}, {0x1F3F5, 0x1F3F7}, {0x1F43F, 0x1F43F},
|
||||||
|
{0x1F441, 0x1F441}, {0x1F4FD, 0x1F4FE}, {0x1F53E, 0x1F54A},
|
||||||
|
{0x1F54F, 0x1F54F}, {0x1F568, 0x1F579}, {0x1F57B, 0x1F594},
|
||||||
|
{0x1F597, 0x1F5A3}, {0x1F5A5, 0x1F5FA}, {0x1F650, 0x1F67F},
|
||||||
|
{0x1F6C6, 0x1F6CB}, {0x1F6CD, 0x1F6CF}, {0x1F6E0, 0x1F6EA},
|
||||||
|
{0x1F6F0, 0x1F6F3}, {0x1F700, 0x1F773}, {0x1F780, 0x1F7D4},
|
||||||
|
{0x1F800, 0x1F80B}, {0x1F810, 0x1F847}, {0x1F850, 0x1F859},
|
||||||
|
{0x1F860, 0x1F887}, {0x1F890, 0x1F8AD}, {0xE0001, 0xE0001},
|
||||||
|
{0xE0020, 0xE007F},
|
||||||
|
}
|
||||||
|
|
||||||
|
// Condition holds the flags that control width calculation, such as whether the current locale is CJK (EastAsianWidth).
|
||||||
|
type Condition struct {
|
||||||
|
EastAsianWidth bool
|
||||||
|
ZeroWidthJoiner bool
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewCondition returns a new Condition initialized from the current locale.
|
||||||
|
func NewCondition() *Condition {
|
||||||
|
return &Condition{
|
||||||
|
EastAsianWidth: EastAsianWidth,
|
||||||
|
ZeroWidthJoiner: ZeroWidthJoiner,
|
||||||
|
}
|
||||||
|
}
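For illustration only (not part of the vendored file), a minimal sketch of using a Condition to force East Asian ambiguous-width handling regardless of the detected locale; the sample runes are assumptions chosen for the example:

```go
package main

import (
	"fmt"

	runewidth "github.com/mattn/go-runewidth"
)

func main() {
	c := runewidth.NewCondition()
	c.EastAsianWidth = true // treat ambiguous-width runes as wide (2 cells)

	fmt.Println(c.StringWidth("abc")) // 3: plain ASCII is always narrow
	fmt.Println(c.StringWidth("±α"))  // likely 4 here: both runes have ambiguous width
}
```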
|
||||||
|
|
||||||
|
// RuneWidth returns the number of cells in r.
|
||||||
|
// See http://www.unicode.org/reports/tr11/
|
||||||
|
func (c *Condition) RuneWidth(r rune) int {
|
||||||
|
switch {
|
||||||
|
case r < 0 || r > 0x10FFFF ||
|
||||||
|
inTables(r, nonprint, combining, notassigned):
|
||||||
|
return 0
|
||||||
|
case (c.EastAsianWidth && IsAmbiguousWidth(r)) ||
|
||||||
|
inTables(r, doublewidth, emoji):
|
||||||
|
return 2
|
||||||
|
default:
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (c *Condition) stringWidth(s string) (width int) {
|
||||||
|
for _, r := range []rune(s) {
|
||||||
|
width += c.RuneWidth(r)
|
||||||
|
}
|
||||||
|
return width
|
||||||
|
}
|
||||||
|
|
||||||
|
func (c *Condition) stringWidthZeroJoiner(s string) (width int) {
|
||||||
|
r1, r2 := rune(0), rune(0)
|
||||||
|
for _, r := range []rune(s) {
|
||||||
|
if r == 0xFE0E || r == 0xFE0F {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
w := c.RuneWidth(r)
|
||||||
|
if r2 == 0x200D && inTables(r, emoji) && inTables(r1, emoji) {
|
||||||
|
w = 0
|
||||||
|
}
|
||||||
|
width += w
|
||||||
|
r1, r2 = r2, r
|
||||||
|
}
|
||||||
|
return width
|
||||||
|
}
|
||||||
|
|
||||||
|
// StringWidth returns the display width of the string in cells.
|
||||||
|
func (c *Condition) StringWidth(s string) (width int) {
|
||||||
|
if c.ZeroWidthJoiner {
|
||||||
|
return c.stringWidthZeroJoiner(s)
|
||||||
|
}
|
||||||
|
return c.stringWidth(s)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Truncate returns the string truncated to w cells, with tail appended when truncation occurs.
|
||||||
|
func (c *Condition) Truncate(s string, w int, tail string) string {
|
||||||
|
if c.StringWidth(s) <= w {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
r := []rune(s)
|
||||||
|
tw := c.StringWidth(tail)
|
||||||
|
w -= tw
|
||||||
|
width := 0
|
||||||
|
i := 0
|
||||||
|
for ; i < len(r); i++ {
|
||||||
|
cw := c.RuneWidth(r[i])
|
||||||
|
if width+cw > w {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
width += cw
|
||||||
|
}
|
||||||
|
return string(r[0:i]) + tail
|
||||||
|
}
|
||||||
|
|
||||||
|
// Wrap returns the string wrapped so that each line fits within w cells.
|
||||||
|
func (c *Condition) Wrap(s string, w int) string {
|
||||||
|
width := 0
|
||||||
|
out := ""
|
||||||
|
for _, r := range []rune(s) {
|
||||||
|
cw := c.RuneWidth(r) // use the receiver's condition so the EastAsianWidth setting is honored
|
||||||
|
if r == '\n' {
|
||||||
|
out += string(r)
|
||||||
|
width = 0
|
||||||
|
continue
|
||||||
|
} else if width+cw > w {
|
||||||
|
out += "\n"
|
||||||
|
width = 0
|
||||||
|
out += string(r)
|
||||||
|
width += cw
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
out += string(r)
|
||||||
|
width += cw
|
||||||
|
}
|
||||||
|
return out
|
||||||
|
}
|
||||||
|
|
||||||
|
// FillLeft returns the string padded on the left with spaces to a total of w cells.
|
||||||
|
func (c *Condition) FillLeft(s string, w int) string {
|
||||||
|
width := c.StringWidth(s)
|
||||||
|
count := w - width
|
||||||
|
if count > 0 {
|
||||||
|
b := make([]byte, count)
|
||||||
|
for i := range b {
|
||||||
|
b[i] = ' '
|
||||||
|
}
|
||||||
|
return string(b) + s
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// FillRight returns the string padded on the right with spaces to a total of w cells.
|
||||||
|
func (c *Condition) FillRight(s string, w int) string {
|
||||||
|
width := c.StringWidth(s)
|
||||||
|
count := w - width
|
||||||
|
if count > 0 {
|
||||||
|
b := make([]byte, count)
|
||||||
|
for i := range b {
|
||||||
|
b[i] = ' '
|
||||||
|
}
|
||||||
|
return s + string(b)
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
// RuneWidth returns the number of cells in r.
|
||||||
|
// See http://www.unicode.org/reports/tr11/
|
||||||
|
func RuneWidth(r rune) int {
|
||||||
|
return DefaultCondition.RuneWidth(r)
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsAmbiguousWidth reports whether r has ambiguous width.
|
||||||
|
func IsAmbiguousWidth(r rune) bool {
|
||||||
|
return inTables(r, private, ambiguous)
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsNeutralWidth reports whether r has neutral width.
|
||||||
|
func IsNeutralWidth(r rune) bool {
|
||||||
|
return inTable(r, neutral)
|
||||||
|
}
|
||||||
|
|
||||||
|
// StringWidth returns the display width of the string in cells.
|
||||||
|
func StringWidth(s string) (width int) {
|
||||||
|
return DefaultCondition.StringWidth(s)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Truncate returns the string truncated to w cells, with tail appended when truncation occurs.
|
||||||
|
func Truncate(s string, w int, tail string) string {
|
||||||
|
return DefaultCondition.Truncate(s, w, tail)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Wrap returns the string wrapped so that each line fits within w cells.
|
||||||
|
func Wrap(s string, w int) string {
|
||||||
|
return DefaultCondition.Wrap(s, w)
|
||||||
|
}
|
||||||
|
|
||||||
|
// FillLeft returns the string padded on the left with spaces to a total of w cells.
|
||||||
|
func FillLeft(s string, w int) string {
|
||||||
|
return DefaultCondition.FillLeft(s, w)
|
||||||
|
}
|
||||||
|
|
||||||
|
// FillRight returns the string padded on the right with spaces to a total of w cells.
|
||||||
|
func FillRight(s string, w int) string {
|
||||||
|
return DefaultCondition.FillRight(s, w)
|
||||||
|
}
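As a quick, hedged illustration (not part of the vendored source) of how the package-level helpers above are typically combined; the strings are arbitrary examples:

```go
package main

import (
	"fmt"

	runewidth "github.com/mattn/go-runewidth"
)

func main() {
	s := "こんにちは world"

	fmt.Println(runewidth.StringWidth(s))           // display width in terminal cells
	fmt.Println(runewidth.Truncate(s, 10, "..."))   // cut down to 10 cells, appending "..."
	fmt.Println(runewidth.FillRight("ok", 6) + "|") // pad with spaces up to 6 cells
	fmt.Println(runewidth.Wrap(s, 8))               // insert line breaks so lines fit in 8 cells
}
```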
|
8
vendor/github.com/mattn/go-runewidth/runewidth_appengine.go
generated
vendored
Normal file
|
@ -0,0 +1,8 @@
|
||||||
|
// +build appengine
|
||||||
|
|
||||||
|
package runewidth
|
||||||
|
|
||||||
|
// IsEastAsian returns true if the current locale is CJK
|
||||||
|
func IsEastAsian() bool {
|
||||||
|
return false
|
||||||
|
}
|
9
vendor/github.com/mattn/go-runewidth/runewidth_js.go
generated
vendored
Normal file
|
@ -0,0 +1,9 @@
|
||||||
|
// +build js
|
||||||
|
// +build !appengine
|
||||||
|
|
||||||
|
package runewidth
|
||||||
|
|
||||||
|
func IsEastAsian() bool {
|
||||||
|
// TODO: Implement this for the web. Detect east asian in a compatible way, and return true.
|
||||||
|
return false
|
||||||
|
}
|
79
vendor/github.com/mattn/go-runewidth/runewidth_posix.go
generated
vendored
Normal file
|
@ -0,0 +1,79 @@
|
||||||
|
// +build !windows
|
||||||
|
// +build !js
|
||||||
|
// +build !appengine
|
||||||
|
|
||||||
|
package runewidth
|
||||||
|
|
||||||
|
import (
|
||||||
|
"os"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
var reLoc = regexp.MustCompile(`^[a-z][a-z][a-z]?(?:_[A-Z][A-Z])?\.(.+)`)
|
||||||
|
|
||||||
|
var mblenTable = map[string]int{
|
||||||
|
"utf-8": 6,
|
||||||
|
"utf8": 6,
|
||||||
|
"jis": 8,
|
||||||
|
"eucjp": 3,
|
||||||
|
"euckr": 2,
|
||||||
|
"euccn": 2,
|
||||||
|
"sjis": 2,
|
||||||
|
"cp932": 2,
|
||||||
|
"cp51932": 2,
|
||||||
|
"cp936": 2,
|
||||||
|
"cp949": 2,
|
||||||
|
"cp950": 2,
|
||||||
|
"big5": 2,
|
||||||
|
"gbk": 2,
|
||||||
|
"gb2312": 2,
|
||||||
|
}
|
||||||
|
|
||||||
|
func isEastAsian(locale string) bool {
|
||||||
|
charset := strings.ToLower(locale)
|
||||||
|
r := reLoc.FindStringSubmatch(locale)
|
||||||
|
if len(r) == 2 {
|
||||||
|
charset = strings.ToLower(r[1])
|
||||||
|
}
|
||||||
|
|
||||||
|
if strings.HasSuffix(charset, "@cjk_narrow") {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
for pos, b := range []byte(charset) {
|
||||||
|
if b == '@' {
|
||||||
|
charset = charset[:pos]
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
max := 1
|
||||||
|
if m, ok := mblenTable[charset]; ok {
|
||||||
|
max = m
|
||||||
|
}
|
||||||
|
if max > 1 && (charset[0] != 'u' ||
|
||||||
|
strings.HasPrefix(locale, "ja") ||
|
||||||
|
strings.HasPrefix(locale, "ko") ||
|
||||||
|
strings.HasPrefix(locale, "zh")) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsEastAsian returns true if the current locale is CJK
|
||||||
|
func IsEastAsian() bool {
|
||||||
|
locale := os.Getenv("LC_CTYPE")
|
||||||
|
if locale == "" {
|
||||||
|
locale = os.Getenv("LANG")
|
||||||
|
}
|
||||||
|
|
||||||
|
// ignore C locale
|
||||||
|
if locale == "POSIX" || locale == "C" {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
if len(locale) > 1 && locale[0] == 'C' && (locale[1] == '.' || locale[1] == '-') {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return isEastAsian(locale)
|
||||||
|
}
|
28
vendor/github.com/mattn/go-runewidth/runewidth_windows.go
generated
vendored
Normal file
|
@ -0,0 +1,28 @@
|
||||||
|
// +build windows
|
||||||
|
// +build !appengine
|
||||||
|
|
||||||
|
package runewidth
|
||||||
|
|
||||||
|
import (
|
||||||
|
"syscall"
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
kernel32 = syscall.NewLazyDLL("kernel32")
|
||||||
|
procGetConsoleOutputCP = kernel32.NewProc("GetConsoleOutputCP")
|
||||||
|
)
|
||||||
|
|
||||||
|
// IsEastAsian returns true if the current locale is CJK
|
||||||
|
func IsEastAsian() bool {
|
||||||
|
r1, _, _ := procGetConsoleOutputCP.Call()
|
||||||
|
if r1 == 0 {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
switch int(r1) {
|
||||||
|
case 932, 51932, 936, 949, 950:
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
27
vendor/github.com/mattn/go-xmpp/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,27 @@
|
||||||
|
Copyright (c) 2009 The Go Authors. All rights reserved.
|
||||||
|
|
||||||
|
Redistribution and use in source and binary forms, with or without
|
||||||
|
modification, are permitted provided that the following conditions are
|
||||||
|
met:
|
||||||
|
|
||||||
|
* Redistributions of source code must retain the above copyright
|
||||||
|
notice, this list of conditions and the following disclaimer.
|
||||||
|
* Redistributions in binary form must reproduce the above
|
||||||
|
copyright notice, this list of conditions and the following disclaimer
|
||||||
|
in the documentation and/or other materials provided with the
|
||||||
|
distribution.
|
||||||
|
* Neither the name of Google Inc. nor the names of its
|
||||||
|
contributors may be used to endorse or promote products derived from
|
||||||
|
this software without specific prior written permission.
|
||||||
|
|
||||||
|
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
|
||||||
|
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
|
||||||
|
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
|
||||||
|
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||||
|
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
|
||||||
|
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
|
||||||
|
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
|
||||||
|
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
|
||||||
|
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
6
vendor/github.com/mattn/go-xmpp/README.md
generated
vendored
Normal file
|
@ -0,0 +1,6 @@
|
||||||
|
go-xmpp
|
||||||
|
=======
|
||||||
|
|
||||||
|
Go XMPP library (originally written by Russ Cox)
|
||||||
|
|
||||||
|
[Documentation](https://godoc.org/github.com/mattn/go-xmpp)
|
967
vendor/github.com/mattn/go-xmpp/xmpp.go
generated
vendored
Normal file
|
@ -0,0 +1,967 @@
|
||||||
|
// Copyright 2011 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// TODO(rsc):
|
||||||
|
// More precise error handling.
|
||||||
|
// Presence functionality.
|
||||||
|
// TODO(mattn):
|
||||||
|
// Add proxy authentication.
|
||||||
|
|
||||||
|
// Package xmpp implements a simple Google Talk client
|
||||||
|
// using the XMPP protocol described in RFC 3920 and RFC 3921.
|
||||||
|
package xmpp
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"bytes"
|
||||||
|
"crypto/md5"
|
||||||
|
"crypto/rand"
|
||||||
|
"crypto/tls"
|
||||||
|
"encoding/base64"
|
||||||
|
"encoding/binary"
|
||||||
|
"encoding/xml"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"math/big"
|
||||||
|
"net"
|
||||||
|
"net/http"
|
||||||
|
"net/url"
|
||||||
|
"os"
|
||||||
|
"strings"
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
nsStream = "http://etherx.jabber.org/streams"
|
||||||
|
nsTLS = "urn:ietf:params:xml:ns:xmpp-tls"
|
||||||
|
nsSASL = "urn:ietf:params:xml:ns:xmpp-sasl"
|
||||||
|
nsBind = "urn:ietf:params:xml:ns:xmpp-bind"
|
||||||
|
nsClient = "jabber:client"
|
||||||
|
nsSession = "urn:ietf:params:xml:ns:xmpp-session"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Default TLS configuration options
|
||||||
|
var DefaultConfig tls.Config
|
||||||
|
|
||||||
|
// DebugWriter is the writer used to write debugging output to.
|
||||||
|
var DebugWriter io.Writer = os.Stderr
|
||||||
|
|
||||||
|
// Cookie is a unique XMPP session identifier
|
||||||
|
type Cookie uint64
|
||||||
|
|
||||||
|
func getCookie() Cookie {
|
||||||
|
var buf [8]byte
|
||||||
|
if _, err := rand.Reader.Read(buf[:]); err != nil {
|
||||||
|
panic("Failed to read random bytes: " + err.Error())
|
||||||
|
}
|
||||||
|
return Cookie(binary.LittleEndian.Uint64(buf[:]))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Client holds XMPP connection options
|
||||||
|
type Client struct {
|
||||||
|
conn net.Conn // connection to server
|
||||||
|
jid string // Jabber ID for our connection
|
||||||
|
domain string
|
||||||
|
p *xml.Decoder
|
||||||
|
}
|
||||||
|
|
||||||
|
func (c *Client) JID() string {
|
||||||
|
return c.jid
|
||||||
|
}
|
||||||
|
|
||||||
|
func containsIgnoreCase(s, substr string) bool {
|
||||||
|
s, substr = strings.ToUpper(s), strings.ToUpper(substr)
|
||||||
|
return strings.Contains(s, substr)
|
||||||
|
}
|
||||||
|
|
||||||
|
func connect(host, user, passwd string) (net.Conn, error) {
|
||||||
|
addr := host
|
||||||
|
|
||||||
|
if strings.TrimSpace(host) == "" {
|
||||||
|
a := strings.SplitN(user, "@", 2)
|
||||||
|
if len(a) == 2 {
|
||||||
|
addr = a[1]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
a := strings.SplitN(host, ":", 2)
|
||||||
|
if len(a) == 1 {
|
||||||
|
addr += ":5222"
|
||||||
|
}
|
||||||
|
|
||||||
|
proxy := os.Getenv("HTTP_PROXY")
|
||||||
|
if proxy == "" {
|
||||||
|
proxy = os.Getenv("http_proxy")
|
||||||
|
}
|
||||||
|
// Test for NO_PROXY: it takes a comma-separated list of substrings to match against the address.
|
||||||
|
if proxy != "" {
|
||||||
|
noproxy := os.Getenv("NO_PROXY")
|
||||||
|
if noproxy == "" {
|
||||||
|
noproxy = os.Getenv("no_proxy")
|
||||||
|
}
|
||||||
|
if noproxy != "" {
|
||||||
|
nplist := strings.Split(noproxy, ",")
|
||||||
|
for _, s := range nplist {
|
||||||
|
if containsIgnoreCase(addr, s) {
|
||||||
|
proxy = ""
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if proxy != "" {
|
||||||
|
url, err := url.Parse(proxy)
|
||||||
|
if err == nil {
|
||||||
|
addr = url.Host
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
c, err := net.Dial("tcp", addr)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if proxy != "" {
|
||||||
|
fmt.Fprintf(c, "CONNECT %s HTTP/1.1\r\n", host)
|
||||||
|
fmt.Fprintf(c, "Host: %s\r\n", host)
|
||||||
|
fmt.Fprintf(c, "\r\n")
|
||||||
|
br := bufio.NewReader(c)
|
||||||
|
req, _ := http.NewRequest("CONNECT", host, nil)
|
||||||
|
resp, err := http.ReadResponse(br, req)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if resp.StatusCode != 200 {
|
||||||
|
f := strings.SplitN(resp.Status, " ", 2)
|
||||||
|
return nil, errors.New(f[1])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return c, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Options are used to specify additional options for new clients, such as a Resource.
|
||||||
|
type Options struct {
|
||||||
|
// Host specifies what host to connect to, as either "hostname" or "hostname:port"
|
||||||
|
// If Host is not specified, a DNS SRV lookup should be used to find the host from the domainpart of the JID.
|
||||||
|
// Default the port to 5222.
|
||||||
|
Host string
|
||||||
|
|
||||||
|
// User specifies what user to authenticate to the remote server.
|
||||||
|
User string
|
||||||
|
|
||||||
|
// Password supplies the password to use for authentication with the remote server.
|
||||||
|
Password string
|
||||||
|
|
||||||
|
// Resource specifies an XMPP client resource, like "bot", instead of accepting one
|
||||||
|
// from the server. Use "" to let the server generate one for your client.
|
||||||
|
Resource string
|
||||||
|
|
||||||
|
// OAuthScope provides go-xmpp the required scope for OAuth2 authentication.
|
||||||
|
OAuthScope string
|
||||||
|
|
||||||
|
// OAuthToken provides go-xmpp with the required OAuth2 token used to authenticate
|
||||||
|
OAuthToken string
|
||||||
|
|
||||||
|
// OAuthXmlNs provides go-xmpp with the required namespace used for OAuth2 authentication. This is
|
||||||
|
// provided to the server as the xmlns:auth attribute of the OAuth2 authentication request.
|
||||||
|
OAuthXmlNs string
|
||||||
|
|
||||||
|
// TLS Config
|
||||||
|
TLSConfig *tls.Config
|
||||||
|
|
||||||
|
// InsecureAllowUnencryptedAuth permits authentication over a TCP connection that has not been promoted to
|
||||||
|
// TLS by STARTTLS; this could leak authentication information over the network, or permit man in the middle
|
||||||
|
// attacks.
|
||||||
|
InsecureAllowUnencryptedAuth bool
|
||||||
|
|
||||||
|
// NoTLS directs go-xmpp to not use TLS initially to contact the server; instead, a plain old unencrypted
|
||||||
|
// TCP connection should be used. (Can be combined with StartTLS to support STARTTLS-based servers.)
|
||||||
|
NoTLS bool
|
||||||
|
|
||||||
|
// StartTLS directs go-xmpp to STARTTLS if the server supports it; go-xmpp will automatically STARTTLS
|
||||||
|
// if the server requires it regardless of this option.
|
||||||
|
StartTLS bool
|
||||||
|
|
||||||
|
// Debug output
|
||||||
|
Debug bool
|
||||||
|
|
||||||
|
// Use server sessions
|
||||||
|
Session bool
|
||||||
|
|
||||||
|
// Presence Status
|
||||||
|
Status string
|
||||||
|
|
||||||
|
// Status message
|
||||||
|
StatusMessage string
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewClient establishes a new Client connection based on a set of Options.
|
||||||
|
func (o Options) NewClient() (*Client, error) {
|
||||||
|
host := o.Host
|
||||||
|
c, err := connect(host, o.User, o.Password)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if strings.LastIndex(o.Host, ":") > 0 {
|
||||||
|
host = host[:strings.LastIndex(o.Host, ":")]
|
||||||
|
}
|
||||||
|
|
||||||
|
client := new(Client)
|
||||||
|
if o.NoTLS {
|
||||||
|
client.conn = c
|
||||||
|
} else {
|
||||||
|
var tlsconn *tls.Conn
|
||||||
|
if o.TLSConfig != nil {
|
||||||
|
tlsconn = tls.Client(c, o.TLSConfig)
|
||||||
|
} else {
|
||||||
|
DefaultConfig.ServerName = host
|
||||||
|
newconfig := DefaultConfig
|
||||||
|
newconfig.ServerName = host
|
||||||
|
tlsconn = tls.Client(c, &newconfig)
|
||||||
|
}
|
||||||
|
if err = tlsconn.Handshake(); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
insecureSkipVerify := DefaultConfig.InsecureSkipVerify
|
||||||
|
if o.TLSConfig != nil {
|
||||||
|
insecureSkipVerify = o.TLSConfig.InsecureSkipVerify
|
||||||
|
}
|
||||||
|
if !insecureSkipVerify {
|
||||||
|
if err = tlsconn.VerifyHostname(host); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
client.conn = tlsconn
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := client.init(&o); err != nil {
|
||||||
|
client.Close()
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return client, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewClient creates a new connection to a host given as "hostname" or "hostname:port".
|
||||||
|
// If host is not specified, a DNS SRV lookup should be used to find the host from the domainpart of the JID.
|
||||||
|
// Default the port to 5222.
|
||||||
|
func NewClient(host, user, passwd string, debug bool) (*Client, error) {
|
||||||
|
opts := Options{
|
||||||
|
Host: host,
|
||||||
|
User: user,
|
||||||
|
Password: passwd,
|
||||||
|
Debug: debug,
|
||||||
|
Session: false,
|
||||||
|
}
|
||||||
|
return opts.NewClient()
|
||||||
|
}
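A minimal, hedged usage sketch (not part of the vendored source): connecting through Options and sending one message. The host, account, and password are placeholder assumptions.

```go
package main

import (
	"log"

	xmpp "github.com/mattn/go-xmpp"
)

func main() {
	options := xmpp.Options{
		Host:     "chat.example.com:5222", // placeholder server
		User:     "bot@example.com",       // placeholder JID
		Password: "secret",                // placeholder password
		NoTLS:    true,                    // start on plain TCP...
		StartTLS: true,                    // ...and upgrade via STARTTLS
		Debug:    false,
	}

	client, err := options.NewClient()
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Send a single chat message to another (placeholder) JID.
	if _, err := client.Send(xmpp.Chat{Remote: "friend@example.com", Type: "chat", Text: "hello"}); err != nil {
		log.Fatal(err)
	}
}
```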
|
||||||
|
|
||||||
|
// NewClientNoTLS creates a new client without TLS
|
||||||
|
func NewClientNoTLS(host, user, passwd string, debug bool) (*Client, error) {
|
||||||
|
opts := Options{
|
||||||
|
Host: host,
|
||||||
|
User: user,
|
||||||
|
Password: passwd,
|
||||||
|
NoTLS: true,
|
||||||
|
Debug: debug,
|
||||||
|
Session: false,
|
||||||
|
}
|
||||||
|
return opts.NewClient()
|
||||||
|
}
|
||||||
|
|
||||||
|
// Close closes the XMPP connection
|
||||||
|
func (c *Client) Close() error {
|
||||||
|
if c.conn != (*tls.Conn)(nil) {
|
||||||
|
return c.conn.Close()
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func saslDigestResponse(username, realm, passwd, nonce, cnonceStr, authenticate, digestURI, nonceCountStr string) string {
|
||||||
|
h := func(text string) []byte {
|
||||||
|
h := md5.New()
|
||||||
|
h.Write([]byte(text))
|
||||||
|
return h.Sum(nil)
|
||||||
|
}
|
||||||
|
hex := func(bytes []byte) string {
|
||||||
|
return fmt.Sprintf("%x", bytes)
|
||||||
|
}
|
||||||
|
kd := func(secret, data string) []byte {
|
||||||
|
return h(secret + ":" + data)
|
||||||
|
}
|
||||||
|
|
||||||
|
a1 := string(h(username+":"+realm+":"+passwd)) + ":" + nonce + ":" + cnonceStr
|
||||||
|
a2 := authenticate + ":" + digestURI
|
||||||
|
response := hex(kd(hex(h(a1)), nonce+":"+nonceCountStr+":"+cnonceStr+":auth:"+hex(h(a2))))
|
||||||
|
return response
|
||||||
|
}
|
||||||
|
|
||||||
|
func cnonce() string {
|
||||||
|
randSize := big.NewInt(0)
|
||||||
|
randSize.Lsh(big.NewInt(1), 64)
|
||||||
|
cn, err := rand.Int(rand.Reader, randSize)
|
||||||
|
if err != nil {
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
return fmt.Sprintf("%016x", cn)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (c *Client) init(o *Options) error {
|
||||||
|
|
||||||
|
var domain string
|
||||||
|
var user string
|
||||||
|
a := strings.SplitN(o.User, "@", 2)
|
||||||
|
if len(o.User) > 0 {
|
||||||
|
if len(a) != 2 {
|
||||||
|
return errors.New("xmpp: invalid username (want user@domain): " + o.User)
|
||||||
|
}
|
||||||
|
user = a[0]
|
||||||
|
domain = a[1]
|
||||||
|
} // Otherwise, we'll be attempting ANONYMOUS
|
||||||
|
|
||||||
|
// Declare intent to be a jabber client and gather stream features.
|
||||||
|
f, err := c.startStream(o, domain)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// If the server requires we STARTTLS, attempt to do so.
|
||||||
|
if f, err = c.startTLSIfRequired(f, o, domain); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
if o.User == "" && o.Password == "" {
|
||||||
|
foundAnonymous := false
|
||||||
|
for _, m := range f.Mechanisms.Mechanism {
|
||||||
|
if m == "ANONYMOUS" {
|
||||||
|
fmt.Fprintf(c.conn, "<auth xmlns='%s' mechanism='ANONYMOUS' />\n", nsSASL)
|
||||||
|
foundAnonymous = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if !foundAnonymous {
|
||||||
|
return fmt.Errorf("ANONYMOUS authentication is not an option and username and password were not specified")
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Even digest forms of authentication are unsafe if we do not know that the host
|
||||||
|
// we are talking to is the actual server, and not a man in the middle playing
|
||||||
|
// proxy.
|
||||||
|
if !c.IsEncrypted() && !o.InsecureAllowUnencryptedAuth {
|
||||||
|
return errors.New("refusing to authenticate over unencrypted TCP connection")
|
||||||
|
}
|
||||||
|
|
||||||
|
mechanism := ""
|
||||||
|
for _, m := range f.Mechanisms.Mechanism {
|
||||||
|
if m == "X-OAUTH2" && o.OAuthToken != "" && o.OAuthScope != "" {
|
||||||
|
mechanism = m
|
||||||
|
// Oauth authentication: send base64-encoded \x00 user \x00 token.
|
||||||
|
raw := "\x00" + user + "\x00" + o.OAuthToken
|
||||||
|
enc := make([]byte, base64.StdEncoding.EncodedLen(len(raw)))
|
||||||
|
base64.StdEncoding.Encode(enc, []byte(raw))
|
||||||
|
fmt.Fprintf(c.conn, "<auth xmlns='%s' mechanism='X-OAUTH2' auth:service='oauth2' "+
|
||||||
|
"xmlns:auth='%s'>%s</auth>\n", nsSASL, o.OAuthXmlNs, enc)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
if m == "PLAIN" {
|
||||||
|
mechanism = m
|
||||||
|
// Plain authentication: send base64-encoded \x00 user \x00 password.
|
||||||
|
raw := "\x00" + user + "\x00" + o.Password
|
||||||
|
enc := make([]byte, base64.StdEncoding.EncodedLen(len(raw)))
|
||||||
|
base64.StdEncoding.Encode(enc, []byte(raw))
|
||||||
|
fmt.Fprintf(c.conn, "<auth xmlns='%s' mechanism='PLAIN'>%s</auth>\n", nsSASL, enc)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
if m == "DIGEST-MD5" {
|
||||||
|
mechanism = m
|
||||||
|
// Digest-MD5 authentication
|
||||||
|
fmt.Fprintf(c.conn, "<auth xmlns='%s' mechanism='DIGEST-MD5'/>\n", nsSASL)
|
||||||
|
var ch saslChallenge
|
||||||
|
if err = c.p.DecodeElement(&ch, nil); err != nil {
|
||||||
|
return errors.New("unmarshal <challenge>: " + err.Error())
|
||||||
|
}
|
||||||
|
b, err := base64.StdEncoding.DecodeString(string(ch))
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
tokens := map[string]string{}
|
||||||
|
for _, token := range strings.Split(string(b), ",") {
|
||||||
|
kv := strings.SplitN(strings.TrimSpace(token), "=", 2)
|
||||||
|
if len(kv) == 2 {
|
||||||
|
if kv[1][0] == '"' && kv[1][len(kv[1])-1] == '"' {
|
||||||
|
kv[1] = kv[1][1 : len(kv[1])-1]
|
||||||
|
}
|
||||||
|
tokens[kv[0]] = kv[1]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
realm, _ := tokens["realm"]
|
||||||
|
nonce, _ := tokens["nonce"]
|
||||||
|
qop, _ := tokens["qop"]
|
||||||
|
charset, _ := tokens["charset"]
|
||||||
|
cnonceStr := cnonce()
|
||||||
|
digestURI := "xmpp/" + domain
|
||||||
|
nonceCount := fmt.Sprintf("%08x", 1)
|
||||||
|
digest := saslDigestResponse(user, realm, o.Password, nonce, cnonceStr, "AUTHENTICATE", digestURI, nonceCount)
|
||||||
|
message := "username=\"" + user + "\", realm=\"" + realm + "\", nonce=\"" + nonce + "\", cnonce=\"" + cnonceStr +
|
||||||
|
"\", nc=" + nonceCount + ", qop=" + qop + ", digest-uri=\"" + digestURI + "\", response=" + digest + ", charset=" + charset
|
||||||
|
|
||||||
|
fmt.Fprintf(c.conn, "<response xmlns='%s'>%s</response>\n", nsSASL, base64.StdEncoding.EncodeToString([]byte(message)))
|
||||||
|
|
||||||
|
var rspauth saslRspAuth
|
||||||
|
if err = c.p.DecodeElement(&rspauth, nil); err != nil {
|
||||||
|
return errors.New("unmarshal <challenge>: " + err.Error())
|
||||||
|
}
|
||||||
|
b, err = base64.StdEncoding.DecodeString(string(rspauth))
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
fmt.Fprintf(c.conn, "<response xmlns='%s'/>\n", nsSASL)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if mechanism == "" {
|
||||||
|
return fmt.Errorf("PLAIN authentication is not an option: %v", f.Mechanisms.Mechanism)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// Next message should be either success or failure.
|
||||||
|
name, val, err := next(c.p)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
switch v := val.(type) {
|
||||||
|
case *saslSuccess:
|
||||||
|
case *saslFailure:
|
||||||
|
errorMessage := v.Text
|
||||||
|
if errorMessage == "" {
|
||||||
|
// v.Any is type of sub-element in failure,
|
||||||
|
// which gives a description of what failed if there was no text element
|
||||||
|
errorMessage = v.Any.Local
|
||||||
|
}
|
||||||
|
return errors.New("auth failure: " + errorMessage)
|
||||||
|
default:
|
||||||
|
return errors.New("expected <success> or <failure>, got <" + name.Local + "> in " + name.Space)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Now that we're authenticated, we're supposed to start the stream over again.
|
||||||
|
// Declare intent to be a jabber client.
|
||||||
|
if f, err = c.startStream(o, domain); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate a unique cookie
|
||||||
|
cookie := getCookie()
|
||||||
|
|
||||||
|
// Send IQ message asking to bind to the local user name.
|
||||||
|
if o.Resource == "" {
|
||||||
|
fmt.Fprintf(c.conn, "<iq type='set' id='%x'><bind xmlns='%s'></bind></iq>\n", cookie, nsBind)
|
||||||
|
} else {
|
||||||
|
fmt.Fprintf(c.conn, "<iq type='set' id='%x'><bind xmlns='%s'><resource>%s</resource></bind></iq>\n", cookie, nsBind, o.Resource)
|
||||||
|
}
|
||||||
|
var iq clientIQ
|
||||||
|
if err = c.p.DecodeElement(&iq, nil); err != nil {
|
||||||
|
return errors.New("unmarshal <iq>: " + err.Error())
|
||||||
|
}
|
||||||
|
if &iq.Bind == nil {
|
||||||
|
return errors.New("<iq> result missing <bind>")
|
||||||
|
}
|
||||||
|
c.jid = iq.Bind.Jid // our local id
|
||||||
|
c.domain = domain
|
||||||
|
|
||||||
|
if o.Session {
|
||||||
|
//if server support session, open it
|
||||||
|
fmt.Fprintf(c.conn, "<iq to='%s' type='set' id='%x'><session xmlns='%s'/></iq>", xmlEscape(domain), cookie, nsSession)
|
||||||
|
}
|
||||||
|
|
||||||
|
// We're connected and can now receive and send messages.
|
||||||
|
fmt.Fprintf(c.conn, "<presence xml:lang='en'><show>%s</show><status>%s</status></presence>", o.Status, o.StatusMessage)
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// startTLSIfRequired examines the server's stream features and, if STARTTLS is required or supported, performs the TLS handshake.
|
||||||
|
// f will be updated if the handshake completes, as the new stream's features are typically different from the original.
|
||||||
|
func (c *Client) startTLSIfRequired(f *streamFeatures, o *Options, domain string) (*streamFeatures, error) {
|
||||||
|
// whether we start tls is a matter of opinion: the server's and the user's.
|
||||||
|
switch {
|
||||||
|
case f.StartTLS == nil:
|
||||||
|
// the server does not support STARTTLS
|
||||||
|
return f, nil
|
||||||
|
case !o.StartTLS && f.StartTLS.Required == nil:
|
||||||
|
return f, nil
|
||||||
|
case f.StartTLS.Required != nil:
|
||||||
|
// the server requires STARTTLS.
|
||||||
|
case !o.StartTLS:
|
||||||
|
// the user wants STARTTLS and the server supports it.
|
||||||
|
}
|
||||||
|
var err error
|
||||||
|
|
||||||
|
fmt.Fprintf(c.conn, "<starttls xmlns='urn:ietf:params:xml:ns:xmpp-tls'/>\n")
|
||||||
|
var k tlsProceed
|
||||||
|
if err = c.p.DecodeElement(&k, nil); err != nil {
|
||||||
|
return f, errors.New("unmarshal <proceed>: " + err.Error())
|
||||||
|
}
|
||||||
|
|
||||||
|
tc := o.TLSConfig
|
||||||
|
if tc == nil {
|
||||||
|
tc = new(tls.Config)
|
||||||
|
*tc = DefaultConfig
|
||||||
|
//TODO(scott): we should consider using the server's address or reverse lookup
|
||||||
|
tc.ServerName = domain
|
||||||
|
}
|
||||||
|
t := tls.Client(c.conn, tc)
|
||||||
|
|
||||||
|
if err = t.Handshake(); err != nil {
|
||||||
|
return f, errors.New("starttls handshake: " + err.Error())
|
||||||
|
}
|
||||||
|
c.conn = t
|
||||||
|
|
||||||
|
// restart our declaration of XMPP stream intentions.
|
||||||
|
tf, err := c.startStream(o, domain)
|
||||||
|
if err != nil {
|
||||||
|
return f, err
|
||||||
|
}
|
||||||
|
return tf, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// startStream will start a new XML decoder for the connection, signal the start of a stream to the server and verify that the server has
|
||||||
|
// also started the stream; if o.Debug is true, startStream will tee decoded XML data to stderr. The features advertised by the server
|
||||||
|
// will be returned.
|
||||||
|
func (c *Client) startStream(o *Options, domain string) (*streamFeatures, error) {
|
||||||
|
if o.Debug {
|
||||||
|
c.p = xml.NewDecoder(tee{c.conn, DebugWriter})
|
||||||
|
} else {
|
||||||
|
c.p = xml.NewDecoder(c.conn)
|
||||||
|
}
|
||||||
|
|
||||||
|
_, err := fmt.Fprintf(c.conn, "<?xml version='1.0'?>\n"+
|
||||||
|
"<stream:stream to='%s' xmlns='%s'\n"+
|
||||||
|
" xmlns:stream='%s' version='1.0'>\n",
|
||||||
|
xmlEscape(domain), nsClient, nsStream)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// We expect the server to start a <stream>.
|
||||||
|
se, err := nextStart(c.p)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
if se.Name.Space != nsStream || se.Name.Local != "stream" {
|
||||||
|
return nil, fmt.Errorf("expected <stream> but got <%v> in %v", se.Name.Local, se.Name.Space)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Now we're in the stream and can use Unmarshal.
|
||||||
|
// Next message should be <features> to tell us authentication options.
|
||||||
|
// See section 4.6 in RFC 3920.
|
||||||
|
f := new(streamFeatures)
|
||||||
|
if err = c.p.DecodeElement(f, nil); err != nil {
|
||||||
|
return f, errors.New("unmarshal <features>: " + err.Error())
|
||||||
|
}
|
||||||
|
return f, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsEncrypted will return true if the client is connected using a TLS transport, either because it used
|
||||||
|
// TLS to connect from the outset, or because it successfully used STARTTLS to promote a TCP connection to TLS.
|
||||||
|
func (c *Client) IsEncrypted() bool {
|
||||||
|
_, ok := c.conn.(*tls.Conn)
|
||||||
|
return ok
|
||||||
|
}
|
||||||
|
|
||||||
|
// Chat is an incoming or outgoing XMPP chat message.
|
||||||
|
type Chat struct {
|
||||||
|
Remote string
|
||||||
|
Type string
|
||||||
|
Text string
|
||||||
|
Subject string
|
||||||
|
Thread string
|
||||||
|
Roster Roster
|
||||||
|
Other []string
|
||||||
|
OtherElem []XMLElement
|
||||||
|
Stamp time.Time
|
||||||
|
}
|
||||||
|
|
||||||
|
type Roster []Contact
|
||||||
|
|
||||||
|
type Contact struct {
|
||||||
|
Remote string
|
||||||
|
Name string
|
||||||
|
Group []string
|
||||||
|
}
|
||||||
|
|
||||||
|
// Presence is an XMPP presence notification.
|
||||||
|
type Presence struct {
|
||||||
|
From string
|
||||||
|
To string
|
||||||
|
Type string
|
||||||
|
Show string
|
||||||
|
Status string
|
||||||
|
}
|
||||||
|
|
||||||
|
type IQ struct {
|
||||||
|
ID string
|
||||||
|
From string
|
||||||
|
To string
|
||||||
|
Type string
|
||||||
|
Query []byte
|
||||||
|
}
|
||||||
|
|
||||||
|
// Recv waits to receive the next XMPP stanza.
|
||||||
|
// The return type is a Chat message, a Presence notification, or an IQ stanza.
|
||||||
|
func (c *Client) Recv() (stanza interface{}, err error) {
|
||||||
|
for {
|
||||||
|
_, val, err := next(c.p)
|
||||||
|
if err != nil {
|
||||||
|
return Chat{}, err
|
||||||
|
}
|
||||||
|
switch v := val.(type) {
|
||||||
|
case *clientMessage:
|
||||||
|
stamp, _ := time.Parse(
|
||||||
|
"2006-01-02T15:04:05Z",
|
||||||
|
v.Delay.Stamp,
|
||||||
|
)
|
||||||
|
chat := Chat{
|
||||||
|
Remote: v.From,
|
||||||
|
Type: v.Type,
|
||||||
|
Text: v.Body,
|
||||||
|
Subject: v.Subject,
|
||||||
|
Thread: v.Thread,
|
||||||
|
Other: v.OtherStrings(),
|
||||||
|
OtherElem: v.Other,
|
||||||
|
Stamp: stamp,
|
||||||
|
}
|
||||||
|
return chat, nil
|
||||||
|
case *clientQuery:
|
||||||
|
var r Roster
|
||||||
|
for _, item := range v.Item {
|
||||||
|
r = append(r, Contact{item.Jid, item.Name, item.Group})
|
||||||
|
}
|
||||||
|
return Chat{Type: "roster", Roster: r}, nil
|
||||||
|
case *clientPresence:
|
||||||
|
return Presence{v.From, v.To, v.Type, v.Show, v.Status}, nil
|
||||||
|
case *clientIQ:
|
||||||
|
// TODO check more strictly
|
||||||
|
if bytes.Equal(bytes.TrimSpace(v.Query), []byte(`<ping xmlns='urn:xmpp:ping'/>`)) || bytes.Equal(bytes.TrimSpace(v.Query), []byte(`<ping xmlns="urn:xmpp:ping"/>`)) {
|
||||||
|
err := c.SendResultPing(v.ID, v.From)
|
||||||
|
if err != nil {
|
||||||
|
return Chat{}, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return IQ{ID: v.ID, From: v.From, To: v.To, Type: v.Type, Query: v.Query}, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
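Since Recv returns an interface{}, callers typically type-switch on the concrete stanza type. A minimal sketch (assuming a connected *xmpp.Client as in the earlier example, and the "log" and go-xmpp imports):

```go
// receiveLoop blocks, logging each stanza that Recv hands back until an error occurs.
func receiveLoop(client *xmpp.Client) error {
	for {
		stanza, err := client.Recv()
		if err != nil {
			return err
		}
		switch v := stanza.(type) {
		case xmpp.Chat:
			log.Printf("chat from %s: %s", v.Remote, v.Text)
		case xmpp.Presence:
			log.Printf("presence from %s: %s %s", v.From, v.Show, v.Status)
		case xmpp.IQ:
			log.Printf("iq %s from %s", v.Type, v.From)
		}
	}
}
```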
|
||||||
|
|
||||||
|
// Send sends the message wrapped inside an XMPP message stanza body.
|
||||||
|
func (c *Client) Send(chat Chat) (n int, err error) {
|
||||||
|
var subtext = ``
|
||||||
|
var thdtext = ``
|
||||||
|
if chat.Subject != `` {
|
||||||
|
subtext = `<subject>` + xmlEscape(chat.Subject) + `</subject>`
|
||||||
|
}
|
||||||
|
if chat.Thread != `` {
|
||||||
|
thdtext = `<thread>` + xmlEscape(chat.Thread) + `</thread>`
|
||||||
|
}
|
||||||
|
|
||||||
|
stanza := "<message to='%s' type='%s' id='%s' xml:lang='en'>" + subtext + "<body>%s</body>" + thdtext + "</message>"
|
||||||
|
|
||||||
|
return fmt.Fprintf(c.conn, stanza,
|
||||||
|
xmlEscape(chat.Remote), xmlEscape(chat.Type), cnonce(), xmlEscape(chat.Text))
|
||||||
|
}
|
||||||
|
|
||||||
|
// SendOrg sends the original text without being wrapped in an XMPP message stanza.
|
||||||
|
func (c *Client) SendOrg(org string) (n int, err error) {
|
||||||
|
return fmt.Fprint(c.conn, org)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (c *Client) SendPresence(presence Presence) (n int, err error) {
|
||||||
|
return fmt.Fprintf(c.conn, "<presence from='%s' to='%s'/>", xmlEscape(presence.From), xmlEscape(presence.To))
|
||||||
|
}
|
||||||
|
|
||||||
|
// SendKeepAlive sends a "whitespace keepalive" as described in chapter 4.6.1 of RFC6120.
|
||||||
|
func (c *Client) SendKeepAlive() (n int, err error) {
|
||||||
|
return fmt.Fprintf(c.conn, " ")
|
||||||
|
}
|
||||||
|
|
||||||
|
// SendHtml sends the message as HTML as defined by XEP-0071
|
||||||
|
func (c *Client) SendHtml(chat Chat) (n int, err error) {
|
||||||
|
return fmt.Fprintf(c.conn, "<message to='%s' type='%s' xml:lang='en'>"+
|
||||||
|
"<body>%s</body>"+
|
||||||
|
"<html xmlns='http://jabber.org/protocol/xhtml-im'><body xmlns='http://www.w3.org/1999/xhtml'>%s</body></html></message>",
|
||||||
|
xmlEscape(chat.Remote), xmlEscape(chat.Type), xmlEscape(chat.Text), chat.Text)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Roster asks for the chat roster.
|
||||||
|
func (c *Client) Roster() error {
|
||||||
|
fmt.Fprintf(c.conn, "<iq from='%s' type='get' id='roster1'><query xmlns='jabber:iq:roster'/></iq>\n", xmlEscape(c.jid))
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// RFC 3920 C.1 Streams name space
|
||||||
|
type streamFeatures struct {
|
||||||
|
XMLName xml.Name `xml:"http://etherx.jabber.org/streams features"`
|
||||||
|
StartTLS *tlsStartTLS
|
||||||
|
Mechanisms saslMechanisms
|
||||||
|
Bind bindBind
|
||||||
|
Session bool
|
||||||
|
}
|
||||||
|
|
||||||
|
type streamError struct {
|
||||||
|
XMLName xml.Name `xml:"http://etherx.jabber.org/streams error"`
|
||||||
|
Any xml.Name
|
||||||
|
Text string
|
||||||
|
}
|
||||||
|
|
||||||
|
// RFC 3920 C.3 TLS name space
|
||||||
|
type tlsStartTLS struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-tls starttls"`
|
||||||
|
Required *string `xml:"required"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type tlsProceed struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-tls proceed"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type tlsFailure struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-tls failure"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// RFC 3920 C.4 SASL name space
|
||||||
|
type saslMechanisms struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-sasl mechanisms"`
|
||||||
|
Mechanism []string `xml:"mechanism"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type saslAuth struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-sasl auth"`
|
||||||
|
Mechanism string `xml:",attr"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type saslChallenge string
|
||||||
|
|
||||||
|
type saslRspAuth string
|
||||||
|
|
||||||
|
type saslResponse string
|
||||||
|
|
||||||
|
type saslAbort struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-sasl abort"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type saslSuccess struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-sasl success"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type saslFailure struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-sasl failure"`
|
||||||
|
Any xml.Name `xml:",any"`
|
||||||
|
Text string `xml:"text"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// RFC 3920 C.5 Resource binding name space
|
||||||
|
type bindBind struct {
|
||||||
|
XMLName xml.Name `xml:"urn:ietf:params:xml:ns:xmpp-bind bind"`
|
||||||
|
Resource string
|
||||||
|
Jid string `xml:"jid"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// RFC 3921 B.1 jabber:client
|
||||||
|
type clientMessage struct {
|
||||||
|
XMLName xml.Name `xml:"jabber:client message"`
|
||||||
|
From string `xml:"from,attr"`
|
||||||
|
ID string `xml:"id,attr"`
|
||||||
|
To string `xml:"to,attr"`
|
||||||
|
Type string `xml:"type,attr"` // chat, error, groupchat, headline, or normal
|
||||||
|
|
||||||
|
// These should technically be []clientText, but string is much more convenient.
|
||||||
|
Subject string `xml:"subject"`
|
||||||
|
Body string `xml:"body"`
|
||||||
|
Thread string `xml:"thread"`
|
||||||
|
|
||||||
|
// Any hasn't matched element
|
||||||
|
Other []XMLElement `xml:",any"`
|
||||||
|
|
||||||
|
Delay Delay `xml:"delay"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func (m *clientMessage) OtherStrings() []string {
|
||||||
|
a := make([]string, len(m.Other))
|
||||||
|
for i, e := range m.Other {
|
||||||
|
a[i] = e.String()
|
||||||
|
}
|
||||||
|
return a
|
||||||
|
}
|
||||||
|
|
||||||
|
type XMLElement struct {
|
||||||
|
XMLName xml.Name
|
||||||
|
InnerXML string `xml:",innerxml"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func (e *XMLElement) String() string {
|
||||||
|
r := bytes.NewReader([]byte(e.InnerXML))
|
||||||
|
d := xml.NewDecoder(r)
|
||||||
|
var buf bytes.Buffer
|
||||||
|
for {
|
||||||
|
tok, err := d.Token()
|
||||||
|
if err != nil {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
switch v := tok.(type) {
|
||||||
|
case xml.StartElement:
|
||||||
|
err = d.Skip()
|
||||||
|
case xml.CharData:
|
||||||
|
_, err = buf.Write(v)
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return buf.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
type Delay struct {
|
||||||
|
Stamp string `xml:"stamp,attr"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type clientText struct {
|
||||||
|
Lang string `xml:",attr"`
|
||||||
|
Body string `xml:"chardata"`
|
||||||
|
}
|
||||||
|
|
||||||
|
type clientPresence struct {
|
||||||
|
XMLName xml.Name `xml:"jabber:client presence"`
|
||||||
|
From string `xml:"from,attr"`
|
||||||
|
ID string `xml:"id,attr"`
|
||||||
|
To string `xml:"to,attr"`
|
||||||
|
Type string `xml:"type,attr"` // error, probe, subscribe, subscribed, unavailable, unsubscribe, unsubscribed
|
||||||
|
Lang string `xml:"lang,attr"`
|
||||||
|
|
||||||
|
Show string `xml:"show"` // away, chat, dnd, xa
|
||||||
|
Status string `xml:"status"` // sb []clientText
|
||||||
|
Priority string `xml:"priority,attr"`
|
||||||
|
Error *clientError
|
||||||
|
}
|
||||||
|
|
||||||
|
type clientIQ struct {
	// info/query
	XMLName xml.Name `xml:"jabber:client iq"`
	From    string   `xml:"from,attr"`
	ID      string   `xml:"id,attr"`
	To      string   `xml:"to,attr"`
	Type    string   `xml:"type,attr"` // error, get, result, set
	Query   []byte   `xml:",innerxml"`
	Error   clientError
	Bind    bindBind
}

type clientError struct {
	XMLName xml.Name `xml:"jabber:client error"`
	Code    string   `xml:",attr"`
	Type    string   `xml:",attr"`
	Any     xml.Name
	Text    string
}

type clientQuery struct {
	Item []rosterItem
}

type rosterItem struct {
	XMLName      xml.Name `xml:"jabber:iq:roster item"`
	Jid          string   `xml:",attr"`
	Name         string   `xml:",attr"`
	Subscription string   `xml:",attr"`
	Group        []string
}

// nextStart scans the XML token stream to find the next StartElement.
func nextStart(p *xml.Decoder) (xml.StartElement, error) {
	for {
		t, err := p.Token()
		if err != nil || t == nil {
			return xml.StartElement{}, err
		}
		switch t := t.(type) {
		case xml.StartElement:
			return t, nil
		}
	}
}

// next scans the XML token stream for the next element, allocates a value of
// the matching stanza type, decodes the element into it, and returns the
// element name together with the decoded value.
func next(p *xml.Decoder) (xml.Name, interface{}, error) {
	// Read start element to find out what type we want.
	se, err := nextStart(p)
	if err != nil {
		return xml.Name{}, nil, err
	}

	// Put it in an interface and allocate one.
	var nv interface{}
	switch se.Name.Space + " " + se.Name.Local {
	case nsStream + " features":
		nv = &streamFeatures{}
	case nsStream + " error":
		nv = &streamError{}
	case nsTLS + " starttls":
		nv = &tlsStartTLS{}
	case nsTLS + " proceed":
		nv = &tlsProceed{}
	case nsTLS + " failure":
		nv = &tlsFailure{}
	case nsSASL + " mechanisms":
		nv = &saslMechanisms{}
	case nsSASL + " challenge":
		nv = ""
	case nsSASL + " response":
		nv = ""
	case nsSASL + " abort":
		nv = &saslAbort{}
	case nsSASL + " success":
		nv = &saslSuccess{}
	case nsSASL + " failure":
		nv = &saslFailure{}
	case nsBind + " bind":
		nv = &bindBind{}
	case nsClient + " message":
		nv = &clientMessage{}
	case nsClient + " presence":
		nv = &clientPresence{}
	case nsClient + " iq":
		nv = &clientIQ{}
	case nsClient + " error":
		nv = &clientError{}
	default:
		return xml.Name{}, nil, errors.New("unexpected XMPP message " +
			se.Name.Space + " <" + se.Name.Local + "/>")
	}

	// Unmarshal into that storage.
	if err = p.DecodeElement(nv, &se); err != nil {
		return xml.Name{}, nil, err
	}

	return se.Name, nv, err
}

// xmlEscape returns s with XML special characters escaped.
func xmlEscape(s string) string {
	var b bytes.Buffer
	xml.Escape(&b, []byte(s))

	return b.String()
}

// tee mirrors everything read from r to w (used for debug output).
type tee struct {
	r io.Reader
	w io.Writer
}

func (t tee) Read(p []byte) (n int, err error) {
	n, err = t.r.Read(p)
	if n > 0 {
		t.w.Write(p[0:n])
		t.w.Write([]byte("\n"))
	}
	return
}
|
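The `next` helper above is the dispatch point for every incoming stanza. As a quick orientation, here is a minimal sketch (not part of the vendored file) of how such a decode loop is typically driven from inside the package; it assumes only the file's existing imports and the `clientIQ` type defined above.

```go
// demoConsume is an illustrative, package-internal sketch: it drains a
// decoder with next() and switches on the concrete stanza type that was
// allocated. Error handling and the remaining stanza kinds are omitted.
func demoConsume(p *xml.Decoder) error {
	for {
		_, val, err := next(p)
		if err != nil {
			return err
		}
		switch v := val.(type) {
		case *clientIQ:
			fmt.Printf("iq id=%s type=%s from=%s\n", v.ID, v.Type, v.From)
		default:
			// presence, messages and stream-level frames would be handled here
		}
	}
}
```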
31
vendor/github.com/mattn/go-xmpp/xmpp_information_query.go
generated
vendored
Normal file
@@ -0,0 +1,31 @@
package xmpp

import (
	"fmt"
	"strconv"
)

const IQTypeGet = "get"
const IQTypeSet = "set"
const IQTypeResult = "result"

// Discovery sends an XEP-0030 service discovery (disco#items) query to the
// server and returns the request id.
func (c *Client) Discovery() (string, error) {
	const namespace = "http://jabber.org/protocol/disco#items"
	// use getCookie for a pseudo-random id.
	reqID := strconv.FormatUint(uint64(getCookie()), 10)
	return c.RawInformationQuery(c.jid, c.domain, reqID, IQTypeGet, namespace, "")
}

// RawInformationQuery sends an information query request to the server.
func (c *Client) RawInformationQuery(from, to, id, iqType, requestNamespace, body string) (string, error) {
	const xmlIQ = "<iq from='%s' to='%s' id='%s' type='%s'><query xmlns='%s'>%s</query></iq>"
	_, err := fmt.Fprintf(c.conn, xmlIQ, xmlEscape(from), xmlEscape(to), id, iqType, requestNamespace, body)
	return id, err
}

// RawInformation sends an IQ request with the payload body to the server.
func (c *Client) RawInformation(from, to, id, iqType, body string) (string, error) {
	const xmlIQ = "<iq from='%s' to='%s' id='%s' type='%s'>%s</iq>"
	_, err := fmt.Fprintf(c.conn, xmlIQ, xmlEscape(from), xmlEscape(to), id, iqType, body)
	return id, err
}
|
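Taken together, `Discovery`, `RawInformationQuery` and `RawInformation` are the package's thin IQ-sending surface. The following standalone sketch shows one plausible way to call them from application code; the server address, account, request ids and namespaces are placeholders, and only the signatures shown above are relied on.

```go
package main

import (
	"log"

	xmpp "github.com/mattn/go-xmpp"
)

func main() {
	// Connection details are placeholders.
	client, err := xmpp.NewClient("example.org:5222", "user@example.org", "password", false)
	if err != nil {
		log.Fatal(err)
	}

	// Ask the server which items/services it offers (XEP-0030 disco#items).
	id, err := client.Discovery()
	if err != nil {
		log.Fatal(err)
	}
	log.Println("sent disco#items query, id =", id)

	// RawInformationQuery can carry any namespace/payload, e.g. disco#info.
	if _, err := client.RawInformationQuery("user@example.org", "example.org", "info1",
		xmpp.IQTypeGet, "http://jabber.org/protocol/disco#info", ""); err != nil {
		log.Fatal(err)
	}
}
```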
135
vendor/github.com/mattn/go-xmpp/xmpp_muc.go
generated
vendored
Normal file
|
@ -0,0 +1,135 @@
|
||||||
|
// Copyright 2013 Flo Lauber <dev@qatfy.at>. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// TODO(flo):
|
||||||
|
// - support password protected MUC rooms
|
||||||
|
// - cleanup signatures of join/leave functions
|
||||||
|
package xmpp
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
nsMUC = "http://jabber.org/protocol/muc"
|
||||||
|
nsMUCUser = "http://jabber.org/protocol/muc#user"
|
||||||
|
NoHistory = 0
|
||||||
|
CharHistory = 1
|
||||||
|
StanzaHistory = 2
|
||||||
|
SecondsHistory = 3
|
||||||
|
SinceHistory = 4
|
||||||
|
)
|
||||||
|
|
||||||
|
// SendTopic sends the room topic wrapped inside an XMPP message stanza body.
func (c *Client) SendTopic(chat Chat) (n int, err error) {
	return fmt.Fprintf(c.conn, "<message to='%s' type='%s' xml:lang='en'>"+"<subject>%s</subject></message>",
		xmlEscape(chat.Remote), xmlEscape(chat.Type), xmlEscape(chat.Text))
}
|
||||||
|
|
||||||
|
func (c *Client) JoinMUCNoHistory(jid, nick string) (n int, err error) {
|
||||||
|
if nick == "" {
|
||||||
|
nick = c.jid
|
||||||
|
}
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>"+
|
||||||
|
"<history maxchars='0'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC)
|
||||||
|
}
|
||||||
|
|
||||||
|
// xep-0045 7.2
|
||||||
|
func (c *Client) JoinMUC(jid, nick string, history_type, history int, history_date *time.Time) (n int, err error) {
|
||||||
|
if nick == "" {
|
||||||
|
nick = c.jid
|
||||||
|
}
|
||||||
|
switch history_type {
|
||||||
|
case NoHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s' />\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC)
|
||||||
|
case CharHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<history maxchars='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, history)
|
||||||
|
case StanzaHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<history maxstanzas='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, history)
|
||||||
|
case SecondsHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<history seconds='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, history)
|
||||||
|
case SinceHistory:
|
||||||
|
if history_date != nil {
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<history since='%s'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, history_date.Format(time.RFC3339))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return 0, errors.New("Unknown history option")
|
||||||
|
}
|
||||||
|
|
||||||
|
// xep-0045 7.2.6
|
||||||
|
func (c *Client) JoinProtectedMUC(jid, nick string, password string, history_type, history int, history_date *time.Time) (n int, err error) {
|
||||||
|
if nick == "" {
|
||||||
|
nick = c.jid
|
||||||
|
}
|
||||||
|
switch history_type {
|
||||||
|
case NoHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<password>%s</password>"+
|
||||||
|
"</x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, xmlEscape(password))
|
||||||
|
case CharHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<password>%s</password>\n"+
|
||||||
|
"<history maxchars='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, xmlEscape(password), history)
|
||||||
|
case StanzaHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<password>%s</password>\n"+
|
||||||
|
"<history maxstanzas='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, xmlEscape(password), history)
|
||||||
|
case SecondsHistory:
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<password>%s</password>\n"+
|
||||||
|
"<history seconds='%d'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, xmlEscape(password), history)
|
||||||
|
case SinceHistory:
|
||||||
|
if history_date != nil {
|
||||||
|
return fmt.Fprintf(c.conn, "<presence to='%s/%s'>\n"+
|
||||||
|
"<x xmlns='%s'>\n"+
|
||||||
|
"<password>%s</password>\n"+
|
||||||
|
"<history since='%s'/></x>\n"+
|
||||||
|
"</presence>",
|
||||||
|
xmlEscape(jid), xmlEscape(nick), nsMUC, xmlEscape(password), history_date.Format(time.RFC3339))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return 0, errors.New("Unknown history option")
|
||||||
|
}
|
||||||
|
|
||||||
|
// xep-0045 7.14
|
||||||
|
func (c *Client) LeaveMUC(jid string) (n int, err error) {
|
||||||
|
return fmt.Fprintf(c.conn, "<presence from='%s' to='%s' type='unavailable' />",
|
||||||
|
c.jid, xmlEscape(jid))
|
||||||
|
}
|
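For orientation, here is a hedged usage sketch of the MUC join helpers defined above; the room JID, nickname, history length and credentials are all placeholders, and only the exported signatures shown in this file are relied on.

```go
package main

import (
	"log"

	xmpp "github.com/mattn/go-xmpp"
)

func main() {
	client, err := xmpp.NewClient("example.org:5222", "bot@example.org", "password", false)
	if err != nil {
		log.Fatal(err)
	}

	// Join a room without requesting any backlog (XEP-0045).
	if _, err := client.JoinMUCNoHistory("room@conference.example.org", "mybot"); err != nil {
		log.Fatal(err)
	}

	// Or request the last 25 stanzas of room history instead.
	if _, err := client.JoinMUC("room@conference.example.org", "mybot",
		xmpp.StanzaHistory, 25, nil); err != nil {
		log.Fatal(err)
	}
}
```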
33
vendor/github.com/mattn/go-xmpp/xmpp_ping.go
generated
vendored
Normal file
@@ -0,0 +1,33 @@
package xmpp

import (
	"fmt"
)

// PingC2S sends an XEP-0199 ping from the client to the server. Empty
// arguments default to the client's own JID and domain.
func (c *Client) PingC2S(jid, server string) error {
	if jid == "" {
		jid = c.jid
	}
	if server == "" {
		server = c.domain
	}
	_, err := fmt.Fprintf(c.conn, "<iq from='%s' to='%s' id='c2s1' type='get'>\n"+
		"<ping xmlns='urn:xmpp:ping'/>\n"+
		"</iq>",
		xmlEscape(jid), xmlEscape(server))
	return err
}

// PingS2S sends an XEP-0199 ping from one server to another.
func (c *Client) PingS2S(fromServer, toServer string) error {
	_, err := fmt.Fprintf(c.conn, "<iq from='%s' to='%s' id='s2s1' type='get'>\n"+
		"<ping xmlns='urn:xmpp:ping'/>\n"+
		"</iq>",
		xmlEscape(fromServer), xmlEscape(toServer))
	return err
}

// SendResultPing answers a ping (or any IQ) with an empty result stanza.
func (c *Client) SendResultPing(id, toServer string) error {
	_, err := fmt.Fprintf(c.conn, "<iq type='result' to='%s' id='%s'/>",
		xmlEscape(toServer), xmlEscape(id))
	return err
}
|
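A common way to use `PingC2S` is as a connection keepalive. The sketch below is illustrative only: the interval and package name are arbitrary, and it assumes an already connected `*xmpp.Client` supplied by the caller.

```go
package xmpputil

import (
	"log"
	"time"

	xmpp "github.com/mattn/go-xmpp"
)

// keepAlive sends an XEP-0199 ping at a fixed interval until one fails.
func keepAlive(client *xmpp.Client) {
	ticker := time.NewTicker(5 * time.Minute)
	defer ticker.Stop()
	for range ticker.C {
		// Empty jid/server arguments fall back to the client's own JID and domain.
		if err := client.PingC2S("", ""); err != nil {
			log.Println("ping failed:", err)
			return
		}
	}
}
```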
20
vendor/github.com/mattn/go-xmpp/xmpp_subscription.go
generated
vendored
Normal file
@@ -0,0 +1,20 @@
package xmpp

import (
	"fmt"
)

// ApproveSubscription grants the jid permission to receive our presence.
func (c *Client) ApproveSubscription(jid string) {
	fmt.Fprintf(c.conn, "<presence to='%s' type='subscribed'/>",
		xmlEscape(jid))
}

// RevokeSubscription removes a previously granted presence subscription.
func (c *Client) RevokeSubscription(jid string) {
	fmt.Fprintf(c.conn, "<presence to='%s' type='unsubscribed'/>",
		xmlEscape(jid))
}

// RequestSubscription asks the jid for permission to see their presence.
func (c *Client) RequestSubscription(jid string) {
	fmt.Fprintf(c.conn, "<presence to='%s' type='subscribe'/>",
		xmlEscape(jid))
}
|
21
vendor/github.com/mmcdole/gofeed/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,21 @@
|
||||||
|
The MIT License (MIT)
|
||||||
|
|
||||||
|
Copyright (c) 2016 mmcdole
|
||||||
|
|
||||||
|
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
of this software and associated documentation files (the "Software"), to deal
|
||||||
|
in the Software without restriction, including without limitation the rights
|
||||||
|
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
copies of the Software, and to permit persons to whom the Software is
|
||||||
|
furnished to do so, subject to the following conditions:
|
||||||
|
|
||||||
|
The above copyright notice and this permission notice shall be included in all
|
||||||
|
copies or substantial portions of the Software.
|
||||||
|
|
||||||
|
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||||
|
SOFTWARE.
|
255
vendor/github.com/mmcdole/gofeed/README.md
generated
vendored
Normal file
|
@ -0,0 +1,255 @@
|
||||||
|
# gofeed
|
||||||
|
|
||||||
|
[![Build Status](https://travis-ci.org/mmcdole/gofeed.svg?branch=master)](https://travis-ci.org/mmcdole/gofeed) [![Coverage Status](https://coveralls.io/repos/github/mmcdole/gofeed/badge.svg?branch=master)](https://coveralls.io/github/mmcdole/gofeed?branch=master) [![Go Report Card](https://goreportcard.com/badge/github.com/mmcdole/gofeed)](https://goreportcard.com/report/github.com/mmcdole/gofeed) [![](https://godoc.org/github.com/mmcdole/gofeed?status.svg)](http://godoc.org/github.com/mmcdole/gofeed) [![License](http://img.shields.io/:license-mit-blue.svg)](http://doge.mit-license.org)
|
||||||
|
|
||||||
|
The `gofeed` library is a robust feed parser that supports parsing both [RSS](https://en.wikipedia.org/wiki/RSS) and [Atom](https://en.wikipedia.org/wiki/Atom_(standard)) feeds. The library provides a universal `gofeed.Parser` that will parse and convert all feed types into a hybrid `gofeed.Feed` model. You also have the option of utilizing the feed specific `atom.Parser` or `rss.Parser` parsers which generate `atom.Feed` and `rss.Feed` respectively.
|
||||||
|
|
||||||
|
## Table of Contents
|
||||||
|
- [Features](#features)
|
||||||
|
- [Overview](#overview)
|
||||||
|
- [Basic Usage](#basic-usage)
|
||||||
|
- [Advanced Usage](#advanced-usage)
|
||||||
|
- [Extensions](#extensions)
|
||||||
|
- [Invalid Feeds](#invalid-feeds)
|
||||||
|
- [Default Mappings](#default-mappings)
|
||||||
|
- [Dependencies](#dependencies)
|
||||||
|
- [License](#license)
|
||||||
|
- [Credits](#credits)
|
||||||
|
|
||||||
|
## Features
|
||||||
|
|
||||||
|
#### Supported feed types:
|
||||||
|
* RSS 0.90
|
||||||
|
* Netscape RSS 0.91
|
||||||
|
* Userland RSS 0.91
|
||||||
|
* RSS 0.92
|
||||||
|
* RSS 0.93
|
||||||
|
* RSS 0.94
|
||||||
|
* RSS 1.0
|
||||||
|
* RSS 2.0
|
||||||
|
* Atom 0.3
|
||||||
|
* Atom 1.0
|
||||||
|
|
||||||
|
#### Extension Support
|
||||||
|
|
||||||
|
The `gofeed` library provides support for parsing several popular predefined extensions into ready-made structs, including [Dublin Core](http://dublincore.org/documents/dces/) and [Apple’s iTunes](https://help.apple.com/itc/podcasts_connect/#/itcb54353390).
|
||||||
|
|
||||||
|
It parses all other feed extensions in a generic way (see the [Extensions](#extensions) section for more details).
|
||||||
|
|
||||||
|
#### Invalid Feeds

A best-effort attempt is made at parsing broken and invalid XML feeds. Currently, `gofeed` can successfully parse feeds with the following issues:
- Unescaped/naked markup in feed elements
- Undeclared namespace prefixes
- Missing closing tags on certain elements
- Illegal tags within feed elements without namespace prefixes
- Missing "required" elements as specified by the respective feed specs
- Incorrect date formats
|
||||||
|
|
||||||
|
## Overview

The `gofeed` library comprises a universal feed parser and several feed-specific parsers. Which one you choose depends entirely on your use case. If you will be handling both RSS and Atom feeds, it makes sense to use the `gofeed.Parser`. If you know ahead of time that you will only be parsing one feed type, it would make sense to use `rss.Parser` or `atom.Parser`.

#### Universal Feed Parser

The universal `gofeed.Parser` works in 3 stages: detection, parsing and translation. It first detects the feed type that it is currently parsing. Then it uses a feed-specific parser to parse the feed into its true representation, which will be either a `rss.Feed` or `atom.Feed`. These models cover every field possible for their respective feed types. Finally, they are *translated* into a `gofeed.Feed` model that is a hybrid of both feed types. Performing the universal feed parsing in these 3 stages allows for more flexibility and keeps the code base more maintainable by separating RSS and Atom parsing into separate packages.

![Diagram](docs/sequence.png)

The translation step is done by anything which adheres to the `gofeed.Translator` interface. The `DefaultRSSTranslator` and `DefaultAtomTranslator` are used behind the scenes when you use the `gofeed.Parser` with its default settings. You can see how they translate fields from ```atom.Feed``` or ```rss.Feed``` to the universal ```gofeed.Feed``` struct in the [Default Mappings](#default-mappings) section. However, should you disagree with the way certain fields are translated, you can easily supply your own `gofeed.Translator` and override this behavior. See the [Advanced Usage](#advanced-usage) section for an example of how to do this.
|
||||||
|
|
||||||
|
#### Feed Specific Parsers
|
||||||
|
|
||||||
|
The `gofeed` library provides two feed specific parsers: `atom.Parser` and `rss.Parser`. If the hybrid `gofeed.Feed` model that the universal `gofeed.Parser` produces does not contain a field from the `atom.Feed` or `rss.Feed` model that you require, it might be beneficial to use the feed specific parsers. When using the `atom.Parser` or `rss.Parser` directly, you can access all of fields found in the `atom.Feed` and `rss.Feed` models. It is also marginally faster because you are able to skip the translation step.
|
||||||
|
|
||||||
|
## Basic Usage
|
||||||
|
|
||||||
|
#### Universal Feed Parser
|
||||||
|
|
||||||
|
The most common usage scenario will be to use ```gofeed.Parser``` to parse an arbitrary RSS or Atom feed into the hybrid ```gofeed.Feed``` model. This hybrid model allows you to treat RSS and Atom feeds the same way.

##### Parse a feed from a URL:
|
||||||
|
|
||||||
|
```go
|
||||||
|
fp := gofeed.NewParser()
|
||||||
|
feed, _ := fp.ParseURL("http://feeds.twit.tv/twit.xml")
|
||||||
|
fmt.Println(feed.Title)
|
||||||
|
```
|
||||||
|
|
||||||
|
##### Parse a feed from a string:
|
||||||
|
|
||||||
|
```go
|
||||||
|
feedData := `<rss version="2.0">
|
||||||
|
<channel>
|
||||||
|
<title>Sample Feed</title>
|
||||||
|
</channel>
|
||||||
|
</rss>`
|
||||||
|
fp := gofeed.NewParser()
|
||||||
|
feed, _ := fp.ParseString(feedData)
|
||||||
|
fmt.Println(feed.Title)
|
||||||
|
```
|
||||||
|
|
||||||
|
##### Parse a feed from an io.Reader:
|
||||||
|
|
||||||
|
```go
|
||||||
|
file, _ := os.Open("/path/to/a/file.xml")
|
||||||
|
defer file.Close()
|
||||||
|
fp := gofeed.NewParser()
|
||||||
|
feed, _ := fp.Parse(file)
|
||||||
|
fmt.Println(feed.Title)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Feed Specific Parsers
|
||||||
|
|
||||||
|
You can easily use the `rss.Parser` and `atom.Parser` directly if you have a usage scenario that requires it:
|
||||||
|
|
||||||
|
##### Parse an RSS feed into an `rss.Feed`
|
||||||
|
|
||||||
|
```go
|
||||||
|
feedData := `<rss version="2.0">
|
||||||
|
<channel>
|
||||||
|
<webMaster>example@site.com (Example Name)</webMaster>
|
||||||
|
</channel>
|
||||||
|
</rss>`
|
||||||
|
fp := rss.Parser{}
|
||||||
|
rssFeed, _ := fp.Parse(strings.NewReader(feedData))
|
||||||
|
fmt.Println(rssFeed.WebMaster)
|
||||||
|
```
|
||||||
|
|
||||||
|
##### Parse an Atom feed into an `atom.Feed`
|
||||||
|
|
||||||
|
```go
|
||||||
|
feedData := `<feed xmlns="http://www.w3.org/2005/Atom">
|
||||||
|
<subtitle>Example Atom</subtitle>
|
||||||
|
</feed>`
|
||||||
|
fp := atom.Parser{}
|
||||||
|
atomFeed, _ := fp.Parse(strings.NewReader(feedData))
|
||||||
|
fmt.Println(atomFeed.Subtitle)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Advanced Usage
|
||||||
|
|
||||||
|
##### Parse a feed while using a custom translator
|
||||||
|
|
||||||
|
The mappings and precedence order that are outlined in the [Default Mappings](#default-mappings) section are provided by the following two structs: `DefaultRSSTranslator` and `DefaultAtomTranslator`. If you have fields that you think should have a different precedence, or if you want to make a translator that is aware of an unsupported extension you can do this by specifying your own RSS or Atom translator when using the `gofeed.Parser`.
|
||||||
|
|
||||||
|
Here is a simple example of creating a custom `Translator` that makes the `/rss/channel/itunes:author` field have a higher precedence than the `/rss/channel/managingEditor` field in RSS feeds. We will wrap the existing `DefaultRSSTranslator` since we only want to change the behavior for a single field.
|
||||||
|
|
||||||
|
First we must define a custom translator:
|
||||||
|
|
||||||
|
```go
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed"
|
||||||
|
"github.com/mmcdole/gofeed/rss"
|
||||||
|
)
|
||||||
|
|
||||||
|
type MyCustomTranslator struct {
|
||||||
|
defaultTranslator *gofeed.DefaultRSSTranslator
|
||||||
|
}
|
||||||
|
|
||||||
|
func NewMyCustomTranslator() *MyCustomTranslator {
|
||||||
|
t := &MyCustomTranslator{}
|
||||||
|
|
||||||
|
// We create a DefaultRSSTranslator internally so we can wrap its Translate
|
||||||
|
// call since we only want to modify the precedence for a single field.
|
||||||
|
t.defaultTranslator = &gofeed.DefaultRSSTranslator{}
|
||||||
|
return t
|
||||||
|
}
|
||||||
|
|
||||||
|
func (ct *MyCustomTranslator) Translate(feed interface{}) (*gofeed.Feed, error) {
	rss, found := feed.(*rss.Feed)
	if !found {
		return nil, fmt.Errorf("Feed did not match expected type of *rss.Feed")
	}

	f, err := ct.defaultTranslator.Translate(rss)
	if err != nil {
		return nil, err
	}

	if rss.ITunesExt != nil && rss.ITunesExt.Author != "" {
		f.Author = rss.ITunesExt.Author
	} else {
		f.Author = rss.ManagingEditor
	}
	return f, nil
}
|
||||||
|
```
|
||||||
|
|
||||||
|
Next you must configure your `gofeed.Parser` to utilize the new `gofeed.Translator`:
|
||||||
|
|
||||||
|
```go
|
||||||
|
feedData := `<rss version="2.0">
|
||||||
|
<channel>
|
||||||
|
<managingEditor>Ender Wiggin</managingEditor>
|
||||||
|
<itunes:author>Valentine Wiggin</itunes:author>
|
||||||
|
</channel>
|
||||||
|
</rss>`
|
||||||
|
|
||||||
|
fp := gofeed.NewParser()
|
||||||
|
fp.RSSTranslator = NewMyCustomTranslator()
|
||||||
|
feed, _ := fp.ParseString(feedData)
|
||||||
|
fmt.Println(feed.Author) // Valentine Wiggin
|
||||||
|
```
|
||||||
|
|
||||||
|
## Extensions

Every element which does not belong to the feed's default namespace is considered an extension by `gofeed`. These are parsed and stored in a tree-like structure located at `Feed.Extensions` and `Item.Extensions`. These fields should allow you to access and read any custom extension elements.

In addition to the generic handling of extensions, `gofeed` also has built-in support for parsing certain popular extensions into their own structs for convenience. It currently supports the [Dublin Core](http://dublincore.org/documents/dces/) and [Apple iTunes](https://help.apple.com/itc/podcasts_connect/#/itcb54353390) extensions, which you can access at `Feed.ITunesExt`, `Feed.DublinCoreExt`, `Item.ITunesExt` and `Item.DublinCoreExt`. A short sketch of both access styles follows.
|
||||||
|
|
||||||
|
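For illustration, a small sketch of both access styles; the feed URL and the `media:thumbnail` element are placeholders, and error handling is elided as in the other examples above.

```go
fp := gofeed.NewParser()
feed, _ := fp.ParseURL("http://example.com/podcast.xml")

// Convenience struct for a natively supported extension:
if feed.ITunesExt != nil {
	fmt.Println("itunes author:", feed.ITunesExt.Author)
}

// Generic tree: namespace prefix -> element name -> occurrences.
if media, ok := feed.Extensions["media"]; ok {
	for _, e := range media["thumbnail"] {
		fmt.Println("thumbnail url:", e.Attrs["url"])
	}
}
```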
## Default Mappings
|
||||||
|
|
||||||
|
The ```DefaultRSSTranslator``` and the ```DefaultAtomTranslator``` map the following ```rss.Feed``` and ```atom.Feed``` fields to their respective ```gofeed.Feed``` fields. They are listed in order of precedence (highest to lowest):
|
||||||
|
|
||||||
|
|
||||||
|
`gofeed.Feed` | RSS | Atom
|
||||||
|
--- | --- | ---
|
||||||
|
Title | /rss/channel/title<br>/rdf:RDF/channel/title<br>/rss/channel/dc:title<br>/rdf:RDF/channel/dc:title | /feed/title
|
||||||
|
Description | /rss/channel/description<br>/rdf:RDF/channel/description<br>/rss/channel/itunes:subtitle | /feed/subtitle<br>/feed/tagline
|
||||||
|
Link | /rss/channel/link<br>/rdf:RDF/channel/link | /feed/link[@rel=”alternate”]/@href<br>/feed/link[not(@rel)]/@href
|
||||||
|
FeedLink | /rss/channel/atom:link[@rel="self"]/@href<br>/rdf:RDF/channel/atom:link[@rel="self"]/@href | /feed/link[@rel="self"]/@href
|
||||||
|
Updated | /rss/channel/lastBuildDate<br>/rss/channel/dc:date<br>/rdf:RDF/channel/dc:date | /feed/updated<br>/feed/modified
|
||||||
|
Published | /rss/channel/pubDate |
|
||||||
|
Author | /rss/channel/managingEditor<br>/rss/channel/webMaster<br>/rss/channel/dc:author<br>/rdf:RDF/channel/dc:author<br>/rss/channel/dc:creator<br>/rdf:RDF/channel/dc:creator<br>/rss/channel/itunes:author | /feed/author
|
||||||
|
Language | /rss/channel/language<br>/rss/channel/dc:language<br>/rdf:RDF/channel/dc:language | /feed/@xml:lang
|
||||||
|
Image | /rss/channel/image<br>/rdf:RDF/image<br>/rss/channel/itunes:image | /feed/logo
|
||||||
|
Copyright | /rss/channel/copyright<br>/rss/channel/dc:rights<br>/rdf:RDF/channel/dc:rights | /feed/rights<br>/feed/copyright
|
||||||
|
Generator | /rss/channel/generator | /feed/generator
|
||||||
|
Categories | /rss/channel/category<br>/rss/channel/itunes:category<br>/rss/channel/itunes:keywords<br>/rss/channel/dc:subject<br>/rdf:RDF/channel/dc:subject | /feed/category
|
||||||
|
|
||||||
|
|
||||||
|
`gofeed.Item` | RSS | Atom
|
||||||
|
--- | --- | ---
|
||||||
|
Title | /rss/channel/item/title<br>/rdf:RDF/item/title<br>/rdf:RDF/item/dc:title<br>/rss/channel/item/dc:title | /feed/entry/title
|
||||||
|
Description | /rss/channel/item/description<br>/rdf:RDF/item/description<br>/rss/channel/item/dc:description<br>/rdf:RDF/item/dc:description | /feed/entry/summary
|
||||||
|
Content | /rss/channel/item/content:encoded | /feed/entry/content
|
||||||
|
Link | /rss/channel/item/link<br>/rdf:RDF/item/link | /feed/entry/link[@rel=”alternate”]/@href<br>/feed/entry/link[not(@rel)]/@href
|
||||||
|
Updated | /rss/channel/item/dc:date<br>/rdf:RDF/rdf:item/dc:date | /feed/entry/modified<br>/feed/entry/updated
|
||||||
|
Published | /rss/channel/item/pubDate<br>/rss/channel/item/dc:date | /feed/entry/published<br>/feed/entry/issued
|
||||||
|
Author | /rss/channel/item/author<br>/rss/channel/item/dc:author<br>/rdf:RDF/item/dc:author<br>/rss/channel/item/dc:creator<br>/rdf:RDF/item/dc:creator<br>/rss/channel/item/itunes:author | /feed/entry/author
|
||||||
|
GUID | /rss/channel/item/guid | /feed/entry/id
|
||||||
|
Image | /rss/channel/item/itunes:image<br>/rss/channel/item/media:image |
|
||||||
|
Categories | /rss/channel/item/category<br>/rss/channel/item/dc:subject<br>/rss/channel/item/itunes:keywords<br>/rdf:RDF/channel/item/dc:subject | /feed/entry/category
|
||||||
|
Enclosures | /rss/channel/item/enclosure | /feed/entry/link[@rel=”enclosure”]
|
||||||
|
|
||||||
|
## Dependencies
|
||||||
|
|
||||||
|
* [goxpp](https://github.com/mmcdole/goxpp) - XML Pull Parser
|
||||||
|
* [goquery](https://github.com/PuerkitoBio/goquery) - Go jQuery-like interface
|
||||||
|
* [testify](https://github.com/stretchr/testify) - Unit test enhancements
|
||||||
|
|
||||||
|
## License
|
||||||
|
|
||||||
|
This project is licensed under the [MIT License](https://raw.githubusercontent.com/mmcdole/gofeed/master/LICENSE)
|
||||||
|
|
||||||
|
## Credits
|
||||||
|
|
||||||
|
* [cristoper](https://github.com/cristoper) for his work on implementing xml:base relative URI handling.
|
||||||
|
* [Mark Pilgrim](https://en.wikipedia.org/wiki/Mark_Pilgrim) and [Kurt McKee](http://kurtmckee.org) for their work on the excellent [Universal Feed Parser](https://github.com/kurtmckee/feedparser) Python library. This library was the inspiration for the `gofeed` library.
|
||||||
|
* [Dan MacTough](http://blog.mact.me) for his work on [node-feedparser](https://github.com/danmactough/node-feedparser). It provided inspiration for the set of fields that should be covered in the hybrid `gofeed.Feed` model.
|
||||||
|
* [Matt Jibson](https://mattjibson.com/) for his date parsing function in the [goread](https://github.com/mjibson/goread) project.
|
||||||
|
* [Jim Teeuwen](https://github.com/jteeuwen) for his method of representing arbitrary feed extensions in the [go-pkg-rss](https://github.com/jteeuwen/go-pkg-rss) library.
|
114
vendor/github.com/mmcdole/gofeed/atom/feed.go
generated
vendored
Normal file
|
@ -0,0 +1,114 @@
|
||||||
|
package atom
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed/extensions"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Feed is an Atom Feed
|
||||||
|
type Feed struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
ID string `json:"id,omitempty"`
|
||||||
|
Updated string `json:"updated,omitempty"`
|
||||||
|
UpdatedParsed *time.Time `json:"updatedParsed,omitempty"`
|
||||||
|
Subtitle string `json:"subtitle,omitempty"`
|
||||||
|
Links []*Link `json:"links,omitempty"`
|
||||||
|
Language string `json:"language,omitempty"`
|
||||||
|
Generator *Generator `json:"generator,omitempty"`
|
||||||
|
Icon string `json:"icon,omitempty"`
|
||||||
|
Logo string `json:"logo,omitempty"`
|
||||||
|
Rights string `json:"rights,omitempty"`
|
||||||
|
Contributors []*Person `json:"contributors,omitempty"`
|
||||||
|
Authors []*Person `json:"authors,omitempty"`
|
||||||
|
Categories []*Category `json:"categories,omitempty"`
|
||||||
|
Entries []*Entry `json:"entries"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
Version string `json:"version"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f Feed) String() string {
|
||||||
|
json, _ := json.MarshalIndent(f, "", " ")
|
||||||
|
return string(json)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Entry is an Atom Entry
|
||||||
|
type Entry struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
ID string `json:"id,omitempty"`
|
||||||
|
Updated string `json:"updated,omitempty"`
|
||||||
|
UpdatedParsed *time.Time `json:"updatedParsed,omitempty"`
|
||||||
|
Summary string `json:"summary,omitempty"`
|
||||||
|
Authors []*Person `json:"authors,omitempty"`
|
||||||
|
Contributors []*Person `json:"contributors,omitempty"`
|
||||||
|
Categories []*Category `json:"categories,omitempty"`
|
||||||
|
Links []*Link `json:"links,omitempty"`
|
||||||
|
Rights string `json:"rights,omitempty"`
|
||||||
|
Published string `json:"published,omitempty"`
|
||||||
|
PublishedParsed *time.Time `json:"publishedParsed,omitempty"`
|
||||||
|
Source *Source `json:"source,omitempty"`
|
||||||
|
Content *Content `json:"content,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Category is category metadata for Feeds and Entries
|
||||||
|
type Category struct {
|
||||||
|
Term string `json:"term,omitempty"`
|
||||||
|
Scheme string `json:"scheme,omitempty"`
|
||||||
|
Label string `json:"label,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Person represents a person in an Atom feed
|
||||||
|
// for things like Authors, Contributors, etc
|
||||||
|
type Person struct {
|
||||||
|
Name string `json:"name,omitempty"`
|
||||||
|
Email string `json:"email,omitempty"`
|
||||||
|
URI string `json:"uri,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Link is an Atom link that defines a reference
|
||||||
|
// from an entry or feed to a Web resource
|
||||||
|
type Link struct {
|
||||||
|
Href string `json:"href,omitempty"`
|
||||||
|
Hreflang string `json:"hreflang,omitempty"`
|
||||||
|
Rel string `json:"rel,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Length string `json:"length,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Content either contains or links to the content of
|
||||||
|
// the entry
|
||||||
|
type Content struct {
|
||||||
|
Src string `json:"src,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
Value string `json:"value,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generator identifies the agent used to generate a
|
||||||
|
// feed, for debugging and other purposes.
|
||||||
|
type Generator struct {
|
||||||
|
Value string `json:"value,omitempty"`
|
||||||
|
URI string `json:"uri,omitempty"`
|
||||||
|
Version string `json:"version,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Source contains the feed information for another
|
||||||
|
// feed if a given entry came from that feed.
|
||||||
|
type Source struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
ID string `json:"id,omitempty"`
|
||||||
|
Updated string `json:"updated,omitempty"`
|
||||||
|
UpdatedParsed *time.Time `json:"updatedParsed,omitempty"`
|
||||||
|
Subtitle string `json:"subtitle,omitempty"`
|
||||||
|
Links []*Link `json:"links,omitempty"`
|
||||||
|
Generator *Generator `json:"generator,omitempty"`
|
||||||
|
Icon string `json:"icon,omitempty"`
|
||||||
|
Logo string `json:"logo,omitempty"`
|
||||||
|
Rights string `json:"rights,omitempty"`
|
||||||
|
Contributors []*Person `json:"contributors,omitempty"`
|
||||||
|
Authors []*Person `json:"authors,omitempty"`
|
||||||
|
Categories []*Category `json:"categories,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
}
|
45
vendor/github.com/mmcdole/gofeed/extensions/dublincore.go
generated
vendored
Normal file
|
@ -0,0 +1,45 @@
|
||||||
|
package ext
|
||||||
|
|
||||||
|
// DublinCoreExtension represents a feed extension
|
||||||
|
// for the Dublin Core specification.
|
||||||
|
type DublinCoreExtension struct {
|
||||||
|
Title []string `json:"title,omitempty"`
|
||||||
|
Creator []string `json:"creator,omitempty"`
|
||||||
|
Author []string `json:"author,omitempty"`
|
||||||
|
Subject []string `json:"subject,omitempty"`
|
||||||
|
Description []string `json:"description,omitempty"`
|
||||||
|
Publisher []string `json:"publisher,omitempty"`
|
||||||
|
Contributor []string `json:"contributor,omitempty"`
|
||||||
|
Date []string `json:"date,omitempty"`
|
||||||
|
Type []string `json:"type,omitempty"`
|
||||||
|
Format []string `json:"format,omitempty"`
|
||||||
|
Identifier []string `json:"identifier,omitempty"`
|
||||||
|
Source []string `json:"source,omitempty"`
|
||||||
|
Language []string `json:"language,omitempty"`
|
||||||
|
Relation []string `json:"relation,omitempty"`
|
||||||
|
Coverage []string `json:"coverage,omitempty"`
|
||||||
|
Rights []string `json:"rights,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewDublinCoreExtension creates a new DublinCoreExtension
|
||||||
|
// given the generic extension map for the "dc" prefix.
|
||||||
|
func NewDublinCoreExtension(extensions map[string][]Extension) *DublinCoreExtension {
|
||||||
|
dc := &DublinCoreExtension{}
|
||||||
|
dc.Title = parseTextArrayExtension("title", extensions)
|
||||||
|
dc.Creator = parseTextArrayExtension("creator", extensions)
|
||||||
|
dc.Author = parseTextArrayExtension("author", extensions)
|
||||||
|
dc.Subject = parseTextArrayExtension("subject", extensions)
|
||||||
|
dc.Description = parseTextArrayExtension("description", extensions)
|
||||||
|
dc.Publisher = parseTextArrayExtension("publisher", extensions)
|
||||||
|
dc.Contributor = parseTextArrayExtension("contributor", extensions)
|
||||||
|
dc.Date = parseTextArrayExtension("date", extensions)
|
||||||
|
dc.Type = parseTextArrayExtension("type", extensions)
|
||||||
|
dc.Format = parseTextArrayExtension("format", extensions)
|
||||||
|
dc.Identifier = parseTextArrayExtension("identifier", extensions)
|
||||||
|
dc.Source = parseTextArrayExtension("source", extensions)
|
||||||
|
dc.Language = parseTextArrayExtension("language", extensions)
|
||||||
|
dc.Relation = parseTextArrayExtension("relation", extensions)
|
||||||
|
dc.Coverage = parseTextArrayExtension("coverage", extensions)
|
||||||
|
dc.Rights = parseTextArrayExtension("rights", extensions)
|
||||||
|
return dc
|
||||||
|
}
|
46
vendor/github.com/mmcdole/gofeed/extensions/extensions.go
generated
vendored
Normal file
|
@ -0,0 +1,46 @@
|
||||||
|
package ext
|
||||||
|
|
||||||
|
// Extensions is the generic extension map for Feeds and Items.
|
||||||
|
// The first map is for the element namespace prefix (e.g., itunes).
|
||||||
|
// The second map is for the element name (e.g., author).
|
||||||
|
type Extensions map[string]map[string][]Extension
|
||||||
|
|
||||||
|
// Extension represents a single XML element that was in a non
|
||||||
|
// default namespace in a Feed or Item/Entry.
|
||||||
|
type Extension struct {
|
||||||
|
Name string `json:"name"`
|
||||||
|
Value string `json:"value"`
|
||||||
|
Attrs map[string]string `json:"attrs"`
|
||||||
|
Children map[string][]Extension `json:"children"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseTextExtension(name string, extensions map[string][]Extension) (value string) {
|
||||||
|
if extensions == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
matches, ok := extensions[name]
|
||||||
|
if !ok || len(matches) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
match := matches[0]
|
||||||
|
return match.Value
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseTextArrayExtension(name string, extensions map[string][]Extension) (values []string) {
|
||||||
|
if extensions == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
matches, ok := extensions[name]
|
||||||
|
if !ok || len(matches) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
values = []string{}
|
||||||
|
for _, m := range matches {
|
||||||
|
values = append(values, m.Value)
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
144
vendor/github.com/mmcdole/gofeed/extensions/itunes.go
generated
vendored
Normal file
|
@ -0,0 +1,144 @@
|
||||||
|
package ext
|
||||||
|
|
||||||
|
// ITunesFeedExtension is a set of extension
|
||||||
|
// fields for RSS feeds.
|
||||||
|
type ITunesFeedExtension struct {
|
||||||
|
Author string `json:"author,omitempty"`
|
||||||
|
Block string `json:"block,omitempty"`
|
||||||
|
Categories []*ITunesCategory `json:"categories,omitempty"`
|
||||||
|
Explicit string `json:"explicit,omitempty"`
|
||||||
|
Keywords string `json:"keywords,omitempty"`
|
||||||
|
Owner *ITunesOwner `json:"owner,omitempty"`
|
||||||
|
Subtitle string `json:"subtitle,omitempty"`
|
||||||
|
Summary string `json:"summary,omitempty"`
|
||||||
|
Image string `json:"image,omitempty"`
|
||||||
|
Complete string `json:"complete,omitempty"`
|
||||||
|
NewFeedURL string `json:"newFeedUrl,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// ITunesItemExtension is a set of extension
|
||||||
|
// fields for RSS items.
|
||||||
|
type ITunesItemExtension struct {
|
||||||
|
Author string `json:"author,omitempty"`
|
||||||
|
Block string `json:"block,omitempty"`
|
||||||
|
Duration string `json:"duration,omitempty"`
|
||||||
|
Explicit string `json:"explicit,omitempty"`
|
||||||
|
Keywords string `json:"keywords,omitempty"`
|
||||||
|
Subtitle string `json:"subtitle,omitempty"`
|
||||||
|
Summary string `json:"summary,omitempty"`
|
||||||
|
Image string `json:"image,omitempty"`
|
||||||
|
IsClosedCaptioned string `json:"isClosedCaptioned,omitempty"`
|
||||||
|
Order string `json:"order,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// ITunesCategory is a category element for itunes feeds.
|
||||||
|
type ITunesCategory struct {
|
||||||
|
Text string `json:"text,omitempty"`
|
||||||
|
Subcategory *ITunesCategory `json:"subcategory,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// ITunesOwner is the owner of a particular itunes feed.
|
||||||
|
type ITunesOwner struct {
|
||||||
|
Email string `json:"email,omitempty"`
|
||||||
|
Name string `json:"name,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewITunesFeedExtension creates an ITunesFeedExtension given an
|
||||||
|
// extension map for the "itunes" key.
|
||||||
|
func NewITunesFeedExtension(extensions map[string][]Extension) *ITunesFeedExtension {
|
||||||
|
feed := &ITunesFeedExtension{}
|
||||||
|
feed.Author = parseTextExtension("author", extensions)
|
||||||
|
feed.Block = parseTextExtension("block", extensions)
|
||||||
|
feed.Explicit = parseTextExtension("explicit", extensions)
|
||||||
|
feed.Keywords = parseTextExtension("keywords", extensions)
|
||||||
|
feed.Subtitle = parseTextExtension("subtitle", extensions)
|
||||||
|
feed.Summary = parseTextExtension("summary", extensions)
|
||||||
|
feed.Image = parseImage(extensions)
|
||||||
|
feed.Complete = parseTextExtension("complete", extensions)
|
||||||
|
feed.NewFeedURL = parseTextExtension("new-feed-url", extensions)
|
||||||
|
feed.Categories = parseCategories(extensions)
|
||||||
|
feed.Owner = parseOwner(extensions)
|
||||||
|
feed.Type = parseTextExtension("type", extensions)
|
||||||
|
return feed
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewITunesItemExtension creates an ITunesItemExtension given an
|
||||||
|
// extension map for the "itunes" key.
|
||||||
|
func NewITunesItemExtension(extensions map[string][]Extension) *ITunesItemExtension {
|
||||||
|
entry := &ITunesItemExtension{}
|
||||||
|
entry.Author = parseTextExtension("author", extensions)
|
||||||
|
entry.Block = parseTextExtension("block", extensions)
|
||||||
|
entry.Duration = parseTextExtension("duration", extensions)
|
||||||
|
entry.Explicit = parseTextExtension("explicit", extensions)
|
||||||
|
entry.Subtitle = parseTextExtension("subtitle", extensions)
|
||||||
|
entry.Summary = parseTextExtension("summary", extensions)
|
||||||
|
entry.Keywords = parseTextExtension("keywords", extensions)
|
||||||
|
entry.Image = parseImage(extensions)
|
||||||
|
entry.IsClosedCaptioned = parseTextExtension("isClosedCaptioned", extensions)
|
||||||
|
entry.Order = parseTextExtension("order", extensions)
|
||||||
|
return entry
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseImage(extensions map[string][]Extension) (image string) {
|
||||||
|
if extensions == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
matches, ok := extensions["image"]
|
||||||
|
if !ok || len(matches) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
image = matches[0].Attrs["href"]
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseOwner(extensions map[string][]Extension) (owner *ITunesOwner) {
|
||||||
|
if extensions == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
matches, ok := extensions["owner"]
|
||||||
|
if !ok || len(matches) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
owner = &ITunesOwner{}
|
||||||
|
if name, ok := matches[0].Children["name"]; ok {
|
||||||
|
owner.Name = name[0].Value
|
||||||
|
}
|
||||||
|
if email, ok := matches[0].Children["email"]; ok {
|
||||||
|
owner.Email = email[0].Value
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseCategories(extensions map[string][]Extension) (categories []*ITunesCategory) {
|
||||||
|
if extensions == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
matches, ok := extensions["category"]
|
||||||
|
if !ok || len(matches) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
categories = []*ITunesCategory{}
|
||||||
|
for _, cat := range matches {
|
||||||
|
c := &ITunesCategory{}
|
||||||
|
if text, ok := cat.Attrs["text"]; ok {
|
||||||
|
c.Text = text
|
||||||
|
}
|
||||||
|
|
||||||
|
if subs, ok := cat.Children["category"]; ok {
|
||||||
|
s := &ITunesCategory{}
|
||||||
|
if text, ok := subs[0].Attrs["text"]; ok {
|
||||||
|
s.Text = text
|
||||||
|
}
|
||||||
|
c.Subcategory = s
|
||||||
|
}
|
||||||
|
categories = append(categories, c)
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
84
vendor/github.com/mmcdole/gofeed/feed.go
generated
vendored
Normal file
|
@ -0,0 +1,84 @@
|
||||||
|
package gofeed
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed/extensions"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Feed is the universal Feed type that atom.Feed
|
||||||
|
// and rss.Feed gets translated to. It represents
|
||||||
|
// a web feed.
|
||||||
|
type Feed struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
FeedLink string `json:"feedLink,omitempty"`
|
||||||
|
Updated string `json:"updated,omitempty"`
|
||||||
|
UpdatedParsed *time.Time `json:"updatedParsed,omitempty"`
|
||||||
|
Published string `json:"published,omitempty"`
|
||||||
|
PublishedParsed *time.Time `json:"publishedParsed,omitempty"`
|
||||||
|
Author *Person `json:"author,omitempty"`
|
||||||
|
Language string `json:"language,omitempty"`
|
||||||
|
Image *Image `json:"image,omitempty"`
|
||||||
|
Copyright string `json:"copyright,omitempty"`
|
||||||
|
Generator string `json:"generator,omitempty"`
|
||||||
|
Categories []string `json:"categories,omitempty"`
|
||||||
|
DublinCoreExt *ext.DublinCoreExtension `json:"dcExt,omitempty"`
|
||||||
|
ITunesExt *ext.ITunesFeedExtension `json:"itunesExt,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
Custom map[string]string `json:"custom,omitempty"`
|
||||||
|
Items []*Item `json:"items"`
|
||||||
|
FeedType string `json:"feedType"`
|
||||||
|
FeedVersion string `json:"feedVersion"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f Feed) String() string {
|
||||||
|
json, _ := json.MarshalIndent(f, "", " ")
|
||||||
|
return string(json)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Item is the universal Item type that atom.Entry
|
||||||
|
// and rss.Item gets translated to. It represents
|
||||||
|
// a single entry in a given feed.
|
||||||
|
type Item struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
Content string `json:"content,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
Updated string `json:"updated,omitempty"`
|
||||||
|
UpdatedParsed *time.Time `json:"updatedParsed,omitempty"`
|
||||||
|
Published string `json:"published,omitempty"`
|
||||||
|
PublishedParsed *time.Time `json:"publishedParsed,omitempty"`
|
||||||
|
Author *Person `json:"author,omitempty"`
|
||||||
|
GUID string `json:"guid,omitempty"`
|
||||||
|
Image *Image `json:"image,omitempty"`
|
||||||
|
Categories []string `json:"categories,omitempty"`
|
||||||
|
Enclosures []*Enclosure `json:"enclosures,omitempty"`
|
||||||
|
DublinCoreExt *ext.DublinCoreExtension `json:"dcExt,omitempty"`
|
||||||
|
ITunesExt *ext.ITunesItemExtension `json:"itunesExt,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
Custom map[string]string `json:"custom,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Person is an individual specified in a feed
|
||||||
|
// (e.g. an author)
|
||||||
|
type Person struct {
|
||||||
|
Name string `json:"name,omitempty"`
|
||||||
|
Email string `json:"email,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Image is an image that is the artwork for a given
|
||||||
|
// feed or item.
|
||||||
|
type Image struct {
|
||||||
|
URL string `json:"url,omitempty"`
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Enclosure is a file associated with a given Item.
|
||||||
|
type Enclosure struct {
|
||||||
|
URL string `json:"url,omitempty"`
|
||||||
|
Length string `json:"length,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
}
|
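The `Feed` and `Item` structs above are what callers of the universal parser actually iterate over. A minimal sketch (not part of the vendored file), assuming `feed` was produced by a `gofeed` parser as in the README examples and using only fields defined in this file:

```go
// Print one line per item, falling back gracefully when no parsed date exists.
for _, item := range feed.Items {
	when := "unknown date"
	if item.PublishedParsed != nil {
		when = item.PublishedParsed.Format("2006-01-02")
	}
	fmt.Printf("%s (%s) -> %s\n", item.Title, when, item.Link)
}
```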
12
vendor/github.com/mmcdole/gofeed/go.mod
generated
vendored
Normal file
|
@ -0,0 +1,12 @@
|
||||||
|
module github.com/mmcdole/gofeed
|
||||||
|
|
||||||
|
require (
|
||||||
|
github.com/PuerkitoBio/goquery v1.5.0
|
||||||
|
github.com/codegangsta/cli v1.20.0
|
||||||
|
github.com/davecgh/go-spew v1.1.1 // indirect
|
||||||
|
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf
|
||||||
|
github.com/pmezard/go-difflib v1.0.0 // indirect
|
||||||
|
github.com/stretchr/testify v1.2.2
|
||||||
|
golang.org/x/net v0.0.0-20181220203305-927f97764cc3
|
||||||
|
golang.org/x/text v0.3.0
|
||||||
|
)
|
20
vendor/github.com/mmcdole/gofeed/go.sum
generated
vendored
Normal file
|
@ -0,0 +1,20 @@
|
||||||
|
github.com/PuerkitoBio/goquery v1.5.0 h1:uGvmFXOA73IKluu/F84Xd1tt/z07GYm8X49XKHP7EJk=
|
||||||
|
github.com/PuerkitoBio/goquery v1.5.0/go.mod h1:qD2PgZ9lccMbQlc7eEOjaeRlFQON7xY8kdmcsrnKqMg=
|
||||||
|
github.com/andybalholm/cascadia v1.0.0 h1:hOCXnnZ5A+3eVDX8pvgl4kofXv2ELss0bKcqRySc45o=
|
||||||
|
github.com/andybalholm/cascadia v1.0.0/go.mod h1:GsXiBklL0woXo1j/WYWtSYYC4ouU9PqHO0sqidkEA4Y=
|
||||||
|
github.com/codegangsta/cli v1.20.0 h1:iX1FXEgwzd5+XN6wk5cVHOGQj6Q3Dcp20lUeS4lHNTw=
|
||||||
|
github.com/codegangsta/cli v1.20.0/go.mod h1:/qJNoX69yVSKu5o4jLyXAENLRyk1uhi7zkbQ3slBdOA=
|
||||||
|
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||||
|
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||||
|
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf h1:sWGE2v+hO0Nd4yFU/S/mDBM5plIU8v/Qhfz41hkDIAI=
|
||||||
|
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf/go.mod h1:pasqhqstspkosTneA62Nc+2p9SOBBYAPbnmRRWPQ0V8=
|
||||||
|
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||||
|
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||||
|
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
|
||||||
|
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||||
|
golang.org/x/net v0.0.0-20180218175443-cbe0f9307d01/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
|
||||||
|
golang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
|
||||||
|
golang.org/x/net v0.0.0-20181220203305-927f97764cc3 h1:eH6Eip3UpmR+yM/qI9Ijluzb1bNv/cAU/n+6l8tRSis=
|
||||||
|
golang.org/x/net v0.0.0-20181220203305-927f97764cc3/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
|
||||||
|
golang.org/x/text v0.3.0 h1:g61tztE5qeGQ89tm6NTjjM9VPIm088od1l6aSorWRWg=
|
||||||
|
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
|
19
vendor/github.com/mmcdole/gofeed/internal/shared/charsetconv.go
generated
vendored
Normal file
|
@ -0,0 +1,19 @@
|
||||||
|
package shared
|
||||||
|
|
||||||
|
import (
|
||||||
|
"io"
|
||||||
|
|
||||||
|
"golang.org/x/net/html/charset"
|
||||||
|
)
|
||||||
|
|
||||||
|
func NewReaderLabel(label string, input io.Reader) (io.Reader, error) {
|
||||||
|
conv, err := charset.NewReaderLabel(label, input)
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Wrap the charset decoder reader with a XML sanitizer
|
||||||
|
//clean := NewXMLSanitizerReader(conv)
|
||||||
|
return conv, nil
|
||||||
|
}
|
219
vendor/github.com/mmcdole/gofeed/internal/shared/dateparser.go
generated
vendored
Normal file
|
@ -0,0 +1,219 @@
|
||||||
|
package shared
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"strings"
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
// DateFormats taken from github.com/mjibson/goread
|
||||||
|
var dateFormats = []string{
|
||||||
|
time.RFC822, // RSS
|
||||||
|
time.RFC822Z, // RSS
|
||||||
|
time.RFC3339, // Atom
|
||||||
|
time.UnixDate,
|
||||||
|
time.RubyDate,
|
||||||
|
time.RFC850,
|
||||||
|
time.RFC1123Z,
|
||||||
|
time.RFC1123,
|
||||||
|
time.ANSIC,
|
||||||
|
"Mon, January 2 2006 15:04:05 -0700",
|
||||||
|
"Mon, Jan 2 2006 15:04:05 -700",
|
||||||
|
"Mon, Jan 2 2006 15:04:05 -0700",
|
||||||
|
"Mon Jan 2 15:04 2006",
|
||||||
|
"Mon Jan 02, 2006 3:04 pm",
|
||||||
|
"Mon Jan 02 2006 15:04:05 -0700",
|
||||||
|
"Monday, January 2, 2006 03:04 PM",
|
||||||
|
"Monday, January 2, 2006",
|
||||||
|
"Monday, January 02, 2006",
|
||||||
|
"Monday, 2 January 2006 15:04:05 -0700",
|
||||||
|
"Monday, 2 Jan 2006 15:04:05 -0700",
|
||||||
|
"Monday, 02 January 2006 15:04:05 -0700",
|
||||||
|
"Monday, 02 January 2006 15:04:05",
|
||||||
|
"Mon, 2 January 2006, 15:04 -0700",
|
||||||
|
"Mon, 2 January 2006 15:04:05 -0700",
|
||||||
|
"Mon, 2 January 2006",
|
||||||
|
"Mon, 2 Jan 2006 3:04:05 PM -0700",
|
||||||
|
"Mon, 2 Jan 2006 15:4:5 -0700 GMT",
|
||||||
|
"Mon, 2, Jan 2006 15:4",
|
||||||
|
"Mon, 2 Jan 2006, 15:04 -0700",
|
||||||
|
"Mon, 2 Jan 2006 15:04 -0700",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05 UT",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05 -0700 MST",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05-0700",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05 -0700",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05",
|
||||||
|
"Mon, 2 Jan 2006 15:04",
|
||||||
|
"Mon,2 Jan 2006",
|
||||||
|
"Mon, 2 Jan 2006",
|
||||||
|
"Mon, 2 Jan 06 15:04:05 -0700",
|
||||||
|
"Mon, 2006-01-02 15:04",
|
||||||
|
"Mon, 02 January 2006",
|
||||||
|
"Mon, 02 Jan 2006 15 -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04 -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 Z",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 UT",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 MST-07:00",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 MST -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 GMT-0700",
|
||||||
|
"Mon,02 Jan 2006 15:04:05 -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 -07:00",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 --0700",
|
||||||
|
"Mon 02 Jan 2006 15:04:05 -0700",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 -07",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 00",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05",
|
||||||
|
"Mon, 02 Jan 2006",
|
||||||
|
"January 2, 2006 3:04 PM",
|
||||||
|
"January 2, 2006, 3:04 p.m.",
|
||||||
|
"January 2, 2006 15:04:05",
|
||||||
|
"January 2, 2006 03:04 PM",
|
||||||
|
"January 2, 2006",
|
||||||
|
"January 02, 2006 15:04",
|
||||||
|
"January 02, 2006 03:04 PM",
|
||||||
|
"January 02, 2006",
|
||||||
|
"Jan 2, 2006 3:04:05 PM",
|
||||||
|
"Jan 2, 2006",
|
||||||
|
"Jan 02 2006 03:04:05PM",
|
||||||
|
"Jan 02, 2006",
|
||||||
|
"6/1/2 15:04",
|
||||||
|
"6-1-2 15:04",
|
||||||
|
"2 January 2006 15:04:05 -0700",
|
||||||
|
"2 January 2006",
|
||||||
|
"2 Jan 2006 15:04:05 Z",
|
||||||
|
"2 Jan 2006 15:04:05 -0700",
|
||||||
|
"2 Jan 2006",
|
||||||
|
"2.1.2006 15:04:05",
|
||||||
|
"2/1/2006",
|
||||||
|
"2-1-2006",
|
||||||
|
"2006 January 02",
|
||||||
|
"2006-1-2T15:04:05Z",
|
||||||
|
"2006-1-2 15:04:05",
|
||||||
|
"2006-1-2",
|
||||||
|
"2006-1-02T15:04:05Z",
|
||||||
|
"2006-01-02T15:04Z",
|
||||||
|
"2006-01-02T15:04-07:00",
|
||||||
|
"2006-01-02T15:04:05Z",
|
||||||
|
"2006-01-02T15:04:05-07:00:00",
|
||||||
|
"2006-01-02T15:04:05:-0700",
|
||||||
|
"2006-01-02T15:04:05-0700",
|
||||||
|
"2006-01-02T15:04:05-07:00",
|
||||||
|
"2006-01-02T15:04:05 -0700",
|
||||||
|
"2006-01-02T15:04:05:00",
|
||||||
|
"2006-01-02T15:04:05",
|
||||||
|
"2006-01-02 at 15:04:05",
|
||||||
|
"2006-01-02 15:04:05Z",
|
||||||
|
"2006-01-02 15:04:05-0700",
|
||||||
|
"2006-01-02 15:04:05-07:00",
|
||||||
|
"2006-01-02 15:04:05 -0700",
|
||||||
|
"2006-01-02 15:04",
|
||||||
|
"2006-01-02 00:00:00.0 15:04:05.0 -0700",
|
||||||
|
"2006/01/02",
|
||||||
|
"2006-01-02",
|
||||||
|
"15:04 02.01.2006 -0700",
|
||||||
|
"1/2/2006 3:04:05 PM",
|
||||||
|
"1/2/2006",
|
||||||
|
"06/1/2 15:04",
|
||||||
|
"06-1-2 15:04",
|
||||||
|
"02 Monday, Jan 2006 15:04",
|
||||||
|
"02 Jan 2006 15:04:05 UT",
|
||||||
|
"02 Jan 2006 15:04:05 -0700",
|
||||||
|
"02 Jan 2006 15:04:05",
|
||||||
|
"02 Jan 2006",
|
||||||
|
"02.01.2006 15:04:05",
|
||||||
|
"02/01/2006 15:04:05",
|
||||||
|
"02.01.2006 15:04",
|
||||||
|
"02/01/2006 - 15:04",
|
||||||
|
"02.01.2006 -0700",
|
||||||
|
"02/01/2006",
|
||||||
|
"02-01-2006",
|
||||||
|
"01/02/2006 3:04 PM",
|
||||||
|
"01/02/2006 - 15:04",
|
||||||
|
"01/02/2006",
|
||||||
|
"01-02-2006",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Named zone cannot be consistently loaded, so handle separately
|
||||||
|
var dateFormatsWithNamedZone = []string{
|
||||||
|
"Mon, January 02, 2006, 15:04:05 MST",
|
||||||
|
"Mon, January 02, 2006 15:04:05 MST",
|
||||||
|
"Mon, Jan 2, 2006 15:04 MST",
|
||||||
|
"Mon, Jan 2 2006 15:04 MST",
|
||||||
|
"Mon, Jan 2, 2006 15:04:05 MST",
|
||||||
|
"Mon Jan 2 15:04:05 2006 MST",
|
||||||
|
"Mon, Jan 02,2006 15:04:05 MST",
|
||||||
|
"Monday, January 2, 2006 15:04:05 MST",
|
||||||
|
"Monday, 2 January 2006 15:04:05 MST",
|
||||||
|
"Monday, 2 Jan 2006 15:04:05 MST",
|
||||||
|
"Monday, 02 January 2006 15:04:05 MST",
|
||||||
|
"Mon, 2 January 2006 15:04 MST",
|
||||||
|
"Mon, 2 January 2006, 15:04:05 MST",
|
||||||
|
"Mon, 2 January 2006 15:04:05 MST",
|
||||||
|
"Mon, 2 Jan 2006 15:4:5 MST",
|
||||||
|
"Mon, 2 Jan 2006 15:04 MST",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05MST",
|
||||||
|
"Mon, 2 Jan 2006 15:04:05 MST",
|
||||||
|
"Mon 2 Jan 2006 15:04:05 MST",
|
||||||
|
"mon,2 Jan 2006 15:04:05 MST",
|
||||||
|
"Mon, 2 Jan 15:04:05 MST",
|
||||||
|
"Mon, 2 Jan 06 15:04:05 MST",
|
||||||
|
"Mon,02 January 2006 14:04:05 MST",
|
||||||
|
"Mon, 02 Jan 2006 3:04:05 PM MST",
|
||||||
|
"Mon,02 Jan 2006 15:04 MST",
|
||||||
|
"Mon, 02 Jan 2006 15:04 MST",
|
||||||
|
"Mon, 02 Jan 2006, 15:04:05 MST",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05MST",
|
||||||
|
"Mon, 02 Jan 2006 15:04:05 MST",
|
||||||
|
"Mon , 02 Jan 2006 15:04:05 MST",
|
||||||
|
"Mon, 02 Jan 06 15:04:05 MST",
|
||||||
|
"January 2, 2006 15:04:05 MST",
|
||||||
|
"January 02, 2006 15:04:05 MST",
|
||||||
|
"Jan 2, 2006 3:04:05 PM MST",
|
||||||
|
"Jan 2, 2006 15:04:05 MST",
|
||||||
|
"2 January 2006 15:04:05 MST",
|
||||||
|
"2 Jan 2006 15:04:05 MST",
|
||||||
|
"2006-01-02 15:04:05 MST",
|
||||||
|
"1/2/2006 3:04:05 PM MST",
|
||||||
|
"1/2/2006 15:04:05 MST",
|
||||||
|
"02 Jan 2006 15:04 MST",
|
||||||
|
"02 Jan 2006 15:04:05 MST",
|
||||||
|
"02/01/2006 15:04 MST",
|
||||||
|
"02-01-2006 15:04:05 MST",
|
||||||
|
"01/02/2006 15:04:05 MST",
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseDate parses a given date string using a large
|
||||||
|
// list of commonly found feed date formats.
|
||||||
|
func ParseDate(ds string) (t time.Time, err error) {
|
||||||
|
d := strings.TrimSpace(ds)
|
||||||
|
if d == "" {
|
||||||
|
return t, fmt.Errorf("Date string is empty")
|
||||||
|
}
|
||||||
|
for _, f := range dateFormats {
|
||||||
|
if t, err = time.Parse(f, d); err == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for _, f := range dateFormatsWithNamedZone {
|
||||||
|
t, err = time.Parse(f, d)
|
||||||
|
if err != nil {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
// This is a format match! Now try to load the timezone name
|
||||||
|
loc, err := time.LoadLocation(t.Location().String())
|
||||||
|
if err != nil {
|
||||||
|
// We couldn't load the TZ name. Just use UTC instead...
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if t, err = time.ParseInLocation(f, ds, loc); err == nil {
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
// This should not be reachable
|
||||||
|
}
|
||||||
|
|
||||||
|
err = fmt.Errorf("Failed to parse date: %s", ds)
|
||||||
|
return
|
||||||
|
}
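As a quick illustration of the call pattern for ParseDate (a sketch only: shared is an internal package, so code like this would have to live inside gofeed itself, and the date string is just one of the layouts listed above; the function name is illustrative):

// exampleParseDate sketches the intended usage: hand ParseDate the raw date
// string from a feed and let it try every known layout in order, falling back
// to the named-zone layouts when none of the offset-based ones match.
func exampleParseDate() {
	t, err := ParseDate("Mon, 02 Jan 2006 15:04:05 -0700")
	if err != nil {
		return // none of the known layouts matched
	}
	_ = t.UTC() // normalize before comparing or storing
}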
|
176
vendor/github.com/mmcdole/gofeed/internal/shared/extparser.go
generated
vendored
Normal file
|
@ -0,0 +1,176 @@
|
||||||
|
package shared
|
||||||
|
|
||||||
|
import (
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed/extensions"
|
||||||
|
"github.com/mmcdole/goxpp"
|
||||||
|
)
|
||||||
|
|
||||||
|
// IsExtension returns whether or not the current
|
||||||
|
// XML element is an extension element (if it has a
|
||||||
|
// non empty prefix)
|
||||||
|
func IsExtension(p *xpp.XMLPullParser) bool {
|
||||||
|
space := strings.TrimSpace(p.Space)
|
||||||
|
if prefix, ok := p.Spaces[space]; ok {
|
||||||
|
return !(prefix == "" || prefix == "rss" || prefix == "rdf" || prefix == "content")
|
||||||
|
}
|
||||||
|
|
||||||
|
return p.Space != ""
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseExtension parses the current element of the
|
||||||
|
// XMLPullParser as an extension element and updates
|
||||||
|
// the extension map
|
||||||
|
func ParseExtension(fe ext.Extensions, p *xpp.XMLPullParser) (ext.Extensions, error) {
|
||||||
|
prefix := prefixForNamespace(p.Space, p)
|
||||||
|
|
||||||
|
result, err := parseExtensionElement(p)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Ensure the extension prefix map exists
|
||||||
|
if _, ok := fe[prefix]; !ok {
|
||||||
|
fe[prefix] = map[string][]ext.Extension{}
|
||||||
|
}
|
||||||
|
// Ensure the extension element slice exists
|
||||||
|
if _, ok := fe[prefix][p.Name]; !ok {
|
||||||
|
fe[prefix][p.Name] = []ext.Extension{}
|
||||||
|
}
|
||||||
|
|
||||||
|
fe[prefix][p.Name] = append(fe[prefix][p.Name], result)
|
||||||
|
return fe, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseExtensionElement(p *xpp.XMLPullParser) (e ext.Extension, err error) {
|
||||||
|
if err = p.Expect(xpp.StartTag, "*"); err != nil {
|
||||||
|
return e, err
|
||||||
|
}
|
||||||
|
|
||||||
|
e.Name = p.Name
|
||||||
|
e.Children = map[string][]ext.Extension{}
|
||||||
|
e.Attrs = map[string]string{}
|
||||||
|
|
||||||
|
for _, attr := range p.Attrs {
|
||||||
|
// TODO: Is it alright that we are stripping
|
||||||
|
// namespace information from attributes?
|
||||||
|
e.Attrs[attr.Name.Local] = attr.Value
|
||||||
|
}
|
||||||
|
|
||||||
|
for {
|
||||||
|
tok, err := p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return e, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if tok == xpp.EndTag {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if tok == xpp.StartTag {
|
||||||
|
child, err := parseExtensionElement(p)
|
||||||
|
if err != nil {
|
||||||
|
return e, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if _, ok := e.Children[child.Name]; !ok {
|
||||||
|
e.Children[child.Name] = []ext.Extension{}
|
||||||
|
}
|
||||||
|
|
||||||
|
e.Children[child.Name] = append(e.Children[child.Name], child)
|
||||||
|
} else if tok == xpp.Text {
|
||||||
|
e.Value += p.Text
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
e.Value = strings.TrimSpace(e.Value)
|
||||||
|
|
||||||
|
if err = p.Expect(xpp.EndTag, e.Name); err != nil {
|
||||||
|
return e, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return e, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func prefixForNamespace(space string, p *xpp.XMLPullParser) string {
|
||||||
|
// First we check if the global namespace map
|
||||||
|
// contains an entry for this namespace/prefix.
|
||||||
|
// This way we can use the canonical prefix for this
|
||||||
|
// ns instead of the one defined in the feed.
|
||||||
|
if prefix, ok := canonicalNamespaces[space]; ok {
|
||||||
|
return prefix
|
||||||
|
}
|
||||||
|
|
||||||
|
// Next we check if the feed itself defined this
|
||||||
|
// namespace and return it if we have a result.
|
||||||
|
if prefix, ok := p.Spaces[space]; ok {
|
||||||
|
return prefix
|
||||||
|
}
|
||||||
|
|
||||||
|
// Lastly, any namespace which is not defined in the
|
||||||
|
// feed will be the prefix itself when using Go's
|
||||||
|
// xml.Decoder.Token() method.
|
||||||
|
return space
|
||||||
|
}
|
||||||
|
|
||||||
|
// Namespaces taken from github.com/kurtmckee/feedparser
|
||||||
|
// These are used for determining canonical name space prefixes
|
||||||
|
// for many of the popular RSS/Atom extensions.
|
||||||
|
//
|
||||||
|
// These canonical prefixes override any prefixes used in the feed itself.
|
||||||
|
var canonicalNamespaces = map[string]string{
|
||||||
|
"http://webns.net/mvcb/": "admin",
|
||||||
|
"http://purl.org/rss/1.0/modules/aggregation/": "ag",
|
||||||
|
"http://purl.org/rss/1.0/modules/annotate/": "annotate",
|
||||||
|
"http://media.tangent.org/rss/1.0/": "audio",
|
||||||
|
"http://backend.userland.com/blogChannelModule": "blogChannel",
|
||||||
|
"http://creativecommons.org/ns#license": "cc",
|
||||||
|
"http://web.resource.org/cc/": "cc",
|
||||||
|
"http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html": "creativeCommons",
|
||||||
|
"http://backend.userland.com/creativeCommonsRssModule": "creativeCommons",
|
||||||
|
"http://purl.org/rss/1.0/modules/company": "co",
|
||||||
|
"http://purl.org/rss/1.0/modules/content/": "content",
|
||||||
|
"http://my.theinfo.org/changed/1.0/rss/": "cp",
|
||||||
|
"http://purl.org/dc/elements/1.1/": "dc",
|
||||||
|
"http://purl.org/dc/terms/": "dcterms",
|
||||||
|
"http://purl.org/rss/1.0/modules/email/": "email",
|
||||||
|
"http://purl.org/rss/1.0/modules/event/": "ev",
|
||||||
|
"http://rssnamespace.org/feedburner/ext/1.0": "feedburner",
|
||||||
|
"http://freshmeat.net/rss/fm/": "fm",
|
||||||
|
"http://xmlns.com/foaf/0.1/": "foaf",
|
||||||
|
"http://www.w3.org/2003/01/geo/wgs84_pos#": "geo",
|
||||||
|
"http://www.georss.org/georss": "georss",
|
||||||
|
"http://www.opengis.net/gml": "gml",
|
||||||
|
"http://postneo.com/icbm/": "icbm",
|
||||||
|
"http://purl.org/rss/1.0/modules/image/": "image",
|
||||||
|
"http://www.itunes.com/DTDs/PodCast-1.0.dtd": "itunes",
|
||||||
|
"http://example.com/DTDs/PodCast-1.0.dtd": "itunes",
|
||||||
|
"http://purl.org/rss/1.0/modules/link/": "l",
|
||||||
|
"http://search.yahoo.com/mrss": "media",
|
||||||
|
"http://search.yahoo.com/mrss/": "media",
|
||||||
|
"http://madskills.com/public/xml/rss/module/pingback/": "pingback",
|
||||||
|
"http://prismstandard.org/namespaces/1.2/basic/": "prism",
|
||||||
|
"http://www.w3.org/1999/02/22-rdf-syntax-ns#": "rdf",
|
||||||
|
"http://www.w3.org/2000/01/rdf-schema#": "rdfs",
|
||||||
|
"http://purl.org/rss/1.0/modules/reference/": "ref",
|
||||||
|
"http://purl.org/rss/1.0/modules/richequiv/": "reqv",
|
||||||
|
"http://purl.org/rss/1.0/modules/search/": "search",
|
||||||
|
"http://purl.org/rss/1.0/modules/slash/": "slash",
|
||||||
|
"http://schemas.xmlsoap.org/soap/envelope/": "soap",
|
||||||
|
"http://purl.org/rss/1.0/modules/servicestatus/": "ss",
|
||||||
|
"http://hacks.benhammersley.com/rss/streaming/": "str",
|
||||||
|
"http://purl.org/rss/1.0/modules/subscription/": "sub",
|
||||||
|
"http://purl.org/rss/1.0/modules/syndication/": "sy",
|
||||||
|
"http://schemas.pocketsoap.com/rss/myDescModule/": "szf",
|
||||||
|
"http://purl.org/rss/1.0/modules/taxonomy/": "taxo",
|
||||||
|
"http://purl.org/rss/1.0/modules/threading/": "thr",
|
||||||
|
"http://purl.org/rss/1.0/modules/textinput/": "ti",
|
||||||
|
"http://madskills.com/public/xml/rss/module/trackback/": "trackback",
|
||||||
|
"http://wellformedweb.org/commentAPI/": "wfw",
|
||||||
|
"http://purl.org/rss/1.0/modules/wiki/": "wiki",
|
||||||
|
"http://www.w3.org/1999/xhtml": "xhtml",
|
||||||
|
"http://www.w3.org/1999/xlink": "xlink",
|
||||||
|
"http://www.w3.org/XML/1998/namespace": "xml",
|
||||||
|
"http://podlove.org/simple-chapters": "psc",
|
||||||
|
}
|
258
vendor/github.com/mmcdole/gofeed/internal/shared/xmlbase.go
generated
vendored
Normal file
|
@ -0,0 +1,258 @@
|
||||||
|
package shared
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"fmt"
|
||||||
|
"golang.org/x/net/html"
|
||||||
|
"net/url"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"github.com/mmcdole/goxpp"
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
// HTML attributes which contain URIs
|
||||||
|
// https://pythonhosted.org/feedparser/resolving-relative-links.html
|
||||||
|
// To catch every possible URI attribute is non-trivial:
|
||||||
|
// https://stackoverflow.com/questions/2725156/complete-list-of-html-tag-attributes-which-have-a-url-value
|
||||||
|
htmlURIAttrs = map[string]bool{
|
||||||
|
"action": true,
|
||||||
|
"background": true,
|
||||||
|
"cite": true,
|
||||||
|
"codebase": true,
|
||||||
|
"data": true,
|
||||||
|
"href": true,
|
||||||
|
"poster": true,
|
||||||
|
"profile": true,
|
||||||
|
"scheme": true,
|
||||||
|
"src": true,
|
||||||
|
"uri": true,
|
||||||
|
"usemap": true,
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
type urlStack []*url.URL
|
||||||
|
|
||||||
|
func (s *urlStack) push(u *url.URL) {
|
||||||
|
*s = append([]*url.URL{u}, *s...)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *urlStack) pop() *url.URL {
|
||||||
|
if s == nil || len(*s) == 0 {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
var top *url.URL
|
||||||
|
top, *s = (*s)[0], (*s)[1:]
|
||||||
|
return top
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *urlStack) top() *url.URL {
|
||||||
|
if s == nil || len(*s) == 0 {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return (*s)[0]
|
||||||
|
}
|
||||||
|
|
||||||
|
type XMLBase struct {
|
||||||
|
stack urlStack
|
||||||
|
URIAttrs map[string]bool
|
||||||
|
}
|
||||||
|
|
||||||
|
// FindRoot iterates through the tokens of an xml document until
|
||||||
|
// it encounters its first StartTag event. It returns an error
|
||||||
|
// if it reaches EndDocument before finding a tag.
|
||||||
|
func (b *XMLBase) FindRoot(p *xpp.XMLPullParser) (event xpp.XMLEventType, err error) {
|
||||||
|
for {
|
||||||
|
event, err = b.NextTag(p)
|
||||||
|
if err != nil {
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
if event == xpp.StartTag {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if event == xpp.EndDocument {
|
||||||
|
return event, fmt.Errorf("Failed to find root node before document end.")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// XMLBase.NextTag iterates through the tokens until it reaches a StartTag or
|
||||||
|
// EndTag. It maintains the urlStack upon encountering StartTag and EndTags, so
|
||||||
|
// that the top of the stack (accessible through the CurrentBase() and
|
||||||
|
// CurrentBaseURL() methods) is the absolute base URI by which relative URIs
|
||||||
|
// should be resolved.
|
||||||
|
//
|
||||||
|
// NextTag is similar to goxpp's NextTag method except it won't throw an error
|
||||||
|
// if the next immediate token isn't a Start/EndTag. Instead, it will continue
|
||||||
|
// to consume tokens until it hits a Start/EndTag or EndDocument.
|
||||||
|
func (b *XMLBase) NextTag(p *xpp.XMLPullParser) (event xpp.XMLEventType, err error) {
|
||||||
|
for {
|
||||||
|
|
||||||
|
if p.Event == xpp.EndTag {
|
||||||
|
// Pop xml:base after each end tag
|
||||||
|
b.pop()
|
||||||
|
}
|
||||||
|
|
||||||
|
event, err = p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if event == xpp.EndTag {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if event == xpp.StartTag {
|
||||||
|
base := parseBase(p)
|
||||||
|
err = b.push(base)
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
err = b.resolveAttrs(p)
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if event == xpp.EndDocument {
|
||||||
|
return event, fmt.Errorf("Failed to find NextTag before reaching the end of the document.")
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseBase(p *xpp.XMLPullParser) string {
|
||||||
|
xmlURI := "http://www.w3.org/XML/1998/namespace"
|
||||||
|
for _, attr := range p.Attrs {
|
||||||
|
if attr.Name.Local == "base" && attr.Name.Space == xmlURI {
|
||||||
|
return attr.Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *XMLBase) push(base string) error {
|
||||||
|
newURL, err := url.Parse(base)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
topURL := b.CurrentBaseURL()
|
||||||
|
if topURL != nil {
|
||||||
|
newURL = topURL.ResolveReference(newURL)
|
||||||
|
}
|
||||||
|
b.stack.push(newURL)
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// returns the popped base URL
|
||||||
|
func (b *XMLBase) pop() string {
|
||||||
|
url := b.stack.pop()
|
||||||
|
if url != nil {
|
||||||
|
return url.String()
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *XMLBase) CurrentBaseURL() *url.URL {
|
||||||
|
return b.stack.top()
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *XMLBase) CurrentBase() string {
|
||||||
|
if url := b.CurrentBaseURL(); url != nil {
|
||||||
|
return url.String()
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
// resolve the given string as a URL relative to current base
|
||||||
|
func (b *XMLBase) ResolveURL(u string) (string, error) {
|
||||||
|
if b.CurrentBase() == "" {
|
||||||
|
return u, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
relURL, err := url.Parse(u)
|
||||||
|
if err != nil {
|
||||||
|
return u, err
|
||||||
|
}
|
||||||
|
curr := b.CurrentBaseURL()
|
||||||
|
if curr.Path != "" && u != "" && curr.Path[len(curr.Path)-1] != '/' {
|
||||||
|
// There's no reason someone would use a path in xml:base if they
|
||||||
|
// didn't mean for it to be a directory
|
||||||
|
curr.Path = curr.Path + "/"
|
||||||
|
}
|
||||||
|
absURL := b.CurrentBaseURL().ResolveReference(relURL)
|
||||||
|
return absURL.String(), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// resolve relative URI attributes according to xml:base
|
||||||
|
func (b *XMLBase) resolveAttrs(p *xpp.XMLPullParser) error {
|
||||||
|
for i, attr := range p.Attrs {
|
||||||
|
lowerName := strings.ToLower(attr.Name.Local)
|
||||||
|
if b.URIAttrs[lowerName] {
|
||||||
|
absURL, err := b.ResolveURL(attr.Value)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
p.Attrs[i].Value = absURL
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Transforms html by resolving any relative URIs in attributes
|
||||||
|
// if an error occurs during parsing or serialization, then the original string
|
||||||
|
// is returned along with the error.
|
||||||
|
func (b *XMLBase) ResolveHTML(relHTML string) (string, error) {
|
||||||
|
if b.CurrentBase() == "" {
|
||||||
|
return relHTML, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
htmlReader := strings.NewReader(relHTML)
|
||||||
|
|
||||||
|
doc, err := html.Parse(htmlReader)
|
||||||
|
if err != nil {
|
||||||
|
return relHTML, err
|
||||||
|
}
|
||||||
|
|
||||||
|
var visit func(*html.Node)
|
||||||
|
|
||||||
|
// recursively traverse HTML resolving any relative URIs in attributes
|
||||||
|
visit = func(n *html.Node) {
|
||||||
|
if n.Type == html.ElementNode {
|
||||||
|
for i, a := range n.Attr {
|
||||||
|
if htmlURIAttrs[a.Key] {
|
||||||
|
absVal, err := b.ResolveURL(a.Val)
|
||||||
|
if err == nil {
|
||||||
|
n.Attr[i].Val = absVal
|
||||||
|
}
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
visit(c)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
visit(doc)
|
||||||
|
var w bytes.Buffer
|
||||||
|
err = html.Render(&w, doc)
|
||||||
|
if err != nil {
|
||||||
|
return relHTML, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// html.Render() always writes a complete html5 document, so strip the html
|
||||||
|
// and body tags
|
||||||
|
absHTML := w.String()
|
||||||
|
absHTML = strings.TrimPrefix(absHTML, "<html><head></head><body>")
|
||||||
|
absHTML = strings.TrimSuffix(absHTML, "</body></html>")
|
||||||
|
|
||||||
|
return absHTML, err
|
||||||
|
}
|
23
vendor/github.com/mmcdole/gofeed/internal/shared/xmlsanitizer.go
generated
vendored
Normal file
|
@ -0,0 +1,23 @@
|
||||||
|
package shared
|
||||||
|
|
||||||
|
import (
|
||||||
|
"io"
|
||||||
|
|
||||||
|
"golang.org/x/text/transform"
|
||||||
|
)
|
||||||
|
|
||||||
|
// NewXMLSanitizerReader creates an io.Reader that
|
||||||
|
// wraps another io.Reader and removes illegal xml
|
||||||
|
// characters from the io stream.
|
||||||
|
func NewXMLSanitizerReader(xml io.Reader) io.Reader {
|
||||||
|
isIllegal := func(r rune) bool {
|
||||||
|
return !(r == 0x09 ||
|
||||||
|
r == 0x0A ||
|
||||||
|
r == 0x0D ||
|
||||||
|
r >= 0x20 && r <= 0xDF77 ||
|
||||||
|
r >= 0xE000 && r <= 0xFFFD ||
|
||||||
|
r >= 0x10000 && r <= 0x10FFFF)
|
||||||
|
}
|
||||||
|
t := transform.Chain(transform.RemoveFunc(isIllegal))
|
||||||
|
return transform.NewReader(xml, t)
|
||||||
|
}
|
156
vendor/github.com/mmcdole/gofeed/parser.go
generated
vendored
Normal file
|
@ -0,0 +1,156 @@
|
||||||
|
package gofeed
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"net/http"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed/atom"
|
||||||
|
"github.com/mmcdole/gofeed/rss"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ErrFeedTypeNotDetected is returned when the detection system can not figure
|
||||||
|
// out the Feed format
|
||||||
|
var ErrFeedTypeNotDetected = errors.New("Failed to detect feed type")
|
||||||
|
|
||||||
|
// HTTPError represents an HTTP error returned by a server.
|
||||||
|
type HTTPError struct {
|
||||||
|
StatusCode int
|
||||||
|
Status string
|
||||||
|
}
|
||||||
|
|
||||||
|
func (err HTTPError) Error() string {
|
||||||
|
return fmt.Sprintf("http error: %s", err.Status)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parser is a universal feed parser that detects
|
||||||
|
// a given feed type, parses it, and translates it
|
||||||
|
// to the universal feed type.
|
||||||
|
type Parser struct {
|
||||||
|
AtomTranslator Translator
|
||||||
|
RSSTranslator Translator
|
||||||
|
Client *http.Client
|
||||||
|
rp *rss.Parser
|
||||||
|
ap *atom.Parser
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewParser creates a universal feed parser.
|
||||||
|
func NewParser() *Parser {
|
||||||
|
fp := Parser{
|
||||||
|
rp: &rss.Parser{},
|
||||||
|
ap: &atom.Parser{},
|
||||||
|
}
|
||||||
|
return &fp
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse parses an RSS or Atom feed into
|
||||||
|
// the universal gofeed.Feed. It takes an
|
||||||
|
// io.Reader which should return the xml content.
|
||||||
|
func (f *Parser) Parse(feed io.Reader) (*Feed, error) {
|
||||||
|
// Wrap the feed io.Reader in an io.TeeReader
|
||||||
|
// so we can capture all the bytes read by the
|
||||||
|
// DetectFeedType function and construct a new
|
||||||
|
// reader with those bytes intact for when we
|
||||||
|
// attempt to parse the feeds.
|
||||||
|
var buf bytes.Buffer
|
||||||
|
tee := io.TeeReader(feed, &buf)
|
||||||
|
feedType := DetectFeedType(tee)
|
||||||
|
|
||||||
|
// Glue the read bytes from the detect function
|
||||||
|
// back into a new reader
|
||||||
|
r := io.MultiReader(&buf, feed)
|
||||||
|
|
||||||
|
switch feedType {
|
||||||
|
case FeedTypeAtom:
|
||||||
|
return f.parseAtomFeed(r)
|
||||||
|
case FeedTypeRSS:
|
||||||
|
return f.parseRSSFeed(r)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil, ErrFeedTypeNotDetected
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseURL fetches the contents of a given url and
|
||||||
|
// attempts to parse the response into the universal feed type.
|
||||||
|
func (f *Parser) ParseURL(feedURL string) (feed *Feed, err error) {
|
||||||
|
client := f.httpClient()
|
||||||
|
|
||||||
|
req, err := http.NewRequest("GET", feedURL, nil)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
req.Header.Set("User-Agent", "Gofeed/1.0")
|
||||||
|
resp, err := client.Do(req)
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if resp != nil {
|
||||||
|
defer func() {
|
||||||
|
ce := resp.Body.Close()
|
||||||
|
if ce != nil {
|
||||||
|
err = ce
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
}
|
||||||
|
|
||||||
|
if resp.StatusCode < 200 || resp.StatusCode >= 300 {
|
||||||
|
return nil, HTTPError{
|
||||||
|
StatusCode: resp.StatusCode,
|
||||||
|
Status: resp.Status,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return f.Parse(resp.Body)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseString parses a feed XML string into the
|
||||||
|
// universal feed type.
|
||||||
|
func (f *Parser) ParseString(feed string) (*Feed, error) {
|
||||||
|
return f.Parse(strings.NewReader(feed))
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f *Parser) parseAtomFeed(feed io.Reader) (*Feed, error) {
|
||||||
|
af, err := f.ap.Parse(feed)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
return f.atomTrans().Translate(af)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f *Parser) parseRSSFeed(feed io.Reader) (*Feed, error) {
|
||||||
|
rf, err := f.rp.Parse(feed)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return f.rssTrans().Translate(rf)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f *Parser) atomTrans() Translator {
|
||||||
|
if f.AtomTranslator != nil {
|
||||||
|
return f.AtomTranslator
|
||||||
|
}
|
||||||
|
f.AtomTranslator = &DefaultAtomTranslator{}
|
||||||
|
return f.AtomTranslator
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f *Parser) rssTrans() Translator {
|
||||||
|
if f.RSSTranslator != nil {
|
||||||
|
return f.RSSTranslator
|
||||||
|
}
|
||||||
|
f.RSSTranslator = &DefaultRSSTranslator{}
|
||||||
|
return f.RSSTranslator
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f *Parser) httpClient() *http.Client {
|
||||||
|
if f.Client != nil {
|
||||||
|
return f.Client
|
||||||
|
}
|
||||||
|
f.Client = &http.Client{}
|
||||||
|
return f.Client
|
||||||
|
}
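To make the public entry points above concrete, here is a minimal usage sketch (the feed URL is a placeholder; Title is a field of the universal Feed type, which is not part of this diff hunk; error handling is kept to the essentials):

package main

import (
	"fmt"
	"log"

	"github.com/mmcdole/gofeed"
)

func main() {
	fp := gofeed.NewParser()
	// ParseURL fetches the URL and dispatches to the Atom or RSS parser based
	// on the detected feed type; a non-2xx response surfaces as HTTPError.
	feed, err := fp.ParseURL("https://example.com/feed.xml")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(feed.Title)
}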
|
120
vendor/github.com/mmcdole/gofeed/rss/feed.go
generated
vendored
Normal file
|
@ -0,0 +1,120 @@
|
||||||
|
package rss
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/mmcdole/gofeed/extensions"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Feed is an RSS Feed
|
||||||
|
type Feed struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
Language string `json:"language,omitempty"`
|
||||||
|
Copyright string `json:"copyright,omitempty"`
|
||||||
|
ManagingEditor string `json:"managingEditor,omitempty"`
|
||||||
|
WebMaster string `json:"webMaster,omitempty"`
|
||||||
|
PubDate string `json:"pubDate,omitempty"`
|
||||||
|
PubDateParsed *time.Time `json:"pubDateParsed,omitempty"`
|
||||||
|
LastBuildDate string `json:"lastBuildDate,omitempty"`
|
||||||
|
LastBuildDateParsed *time.Time `json:"lastBuildDateParsed,omitempty"`
|
||||||
|
Categories []*Category `json:"categories,omitempty"`
|
||||||
|
Generator string `json:"generator,omitempty"`
|
||||||
|
Docs string `json:"docs,omitempty"`
|
||||||
|
TTL string `json:"ttl,omitempty"`
|
||||||
|
Image *Image `json:"image,omitempty"`
|
||||||
|
Rating string `json:"rating,omitempty"`
|
||||||
|
SkipHours []string `json:"skipHours,omitempty"`
|
||||||
|
SkipDays []string `json:"skipDays,omitempty"`
|
||||||
|
Cloud *Cloud `json:"cloud,omitempty"`
|
||||||
|
TextInput *TextInput `json:"textInput,omitempty"`
|
||||||
|
DublinCoreExt *ext.DublinCoreExtension `json:"dcExt,omitempty"`
|
||||||
|
ITunesExt *ext.ITunesFeedExtension `json:"itunesExt,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
Items []*Item `json:"items"`
|
||||||
|
Version string `json:"version"`
|
||||||
|
}
|
||||||
|
|
||||||
|
func (f Feed) String() string {
|
||||||
|
json, _ := json.MarshalIndent(f, "", " ")
|
||||||
|
return string(json)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Item is an RSS Item
|
||||||
|
type Item struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
Content string `json:"content,omitempty"`
|
||||||
|
Author string `json:"author,omitempty"`
|
||||||
|
Categories []*Category `json:"categories,omitempty"`
|
||||||
|
Comments string `json:"comments,omitempty"`
|
||||||
|
Enclosure *Enclosure `json:"enclosure,omitempty"`
|
||||||
|
GUID *GUID `json:"guid,omitempty"`
|
||||||
|
PubDate string `json:"pubDate,omitempty"`
|
||||||
|
PubDateParsed *time.Time `json:"pubDateParsed,omitempty"`
|
||||||
|
Source *Source `json:"source,omitempty"`
|
||||||
|
DublinCoreExt *ext.DublinCoreExtension `json:"dcExt,omitempty"`
|
||||||
|
ITunesExt *ext.ITunesItemExtension `json:"itunesExt,omitempty"`
|
||||||
|
Extensions ext.Extensions `json:"extensions,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Image is an image that represents the feed
|
||||||
|
type Image struct {
|
||||||
|
URL string `json:"url,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Width string `json:"width,omitempty"`
|
||||||
|
Height string `json:"height,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Enclosure is a media object that is attached to
|
||||||
|
// the item
|
||||||
|
type Enclosure struct {
|
||||||
|
URL string `json:"url,omitempty"`
|
||||||
|
Length string `json:"length,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// GUID is a unique identifier for an item
|
||||||
|
type GUID struct {
|
||||||
|
Value string `json:"value,omitempty"`
|
||||||
|
IsPermalink string `json:"isPermalink,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Source contains feed information for another
|
||||||
|
// feed if a given item came from that feed
|
||||||
|
type Source struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
URL string `json:"url,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Category is category metadata for Feeds and Entries
|
||||||
|
type Category struct {
|
||||||
|
Domain string `json:"domain,omitempty"`
|
||||||
|
Value string `json:"value,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// TextInput specifies a text input box that
|
||||||
|
// can be displayed with the channel
|
||||||
|
type TextInput struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Description string `json:"description,omitempty"`
|
||||||
|
Name string `json:"name,omitempty"`
|
||||||
|
Link string `json:"link,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Cloud allows processes to register with a
|
||||||
|
// cloud to be notified of updates to the channel,
|
||||||
|
// implementing a lightweight publish-subscribe protocol
|
||||||
|
// for RSS feeds
|
||||||
|
type Cloud struct {
|
||||||
|
Domain string `json:"domain,omitempty"`
|
||||||
|
Port string `json:"port,omitempty"`
|
||||||
|
Path string `json:"path,omitempty"`
|
||||||
|
RegisterProcedure string `json:"registerProcedure,omitempty"`
|
||||||
|
Protocol string `json:"protocol,omitempty"`
|
||||||
|
}
|
21
vendor/github.com/mmcdole/goxpp/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,21 @@
|
||||||
|
The MIT License (MIT)
|
||||||
|
|
||||||
|
Copyright (c) 2016 mmcdole
|
||||||
|
|
||||||
|
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
of this software and associated documentation files (the "Software"), to deal
|
||||||
|
in the Software without restriction, including without limitation the rights
|
||||||
|
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
copies of the Software, and to permit persons to whom the Software is
|
||||||
|
furnished to do so, subject to the following conditions:
|
||||||
|
|
||||||
|
The above copyright notice and this permission notice shall be included in all
|
||||||
|
copies or substantial portions of the Software.
|
||||||
|
|
||||||
|
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||||
|
SOFTWARE.
|
9
vendor/github.com/mmcdole/goxpp/README.md
generated
vendored
Normal file
|
@ -0,0 +1,9 @@
|
||||||
|
# goxpp
|
||||||
|
|
||||||
|
[![Build Status](https://travis-ci.org/mmcdole/goxpp.svg?branch=master)](https://travis-ci.org/mmcdole/goxpp) [![Coverage Status](https://coveralls.io/repos/github/mmcdole/goxpp/badge.svg?branch=master)](https://coveralls.io/github/mmcdole/goxpp?branch=master) [![License](http://img.shields.io/:license-mit-blue.svg)](http://doge.mit-license.org)
|
||||||
|
[![GoDoc](https://godoc.org/github.com/mmcdole/goxpp?status.svg)](https://godoc.org/github.com/mmcdole/goxpp)
|
||||||
|
|
||||||
|
The `goxpp` library is an XML parser library that is loosely based on the [Java XMLPullParser](http://www.xmlpull.org/v1/download/unpacked/doc/quick_intro.html). This library allows you to easily parse arbitrary XML content using a pull parser. You can think of `goxpp` as a lightweight wrapper around Go's XML `Decoder` that provides a set of functions that make it easier to parse XML content than using the raw decoder itself.
|
||||||
|
|
||||||
|
This project is licensed under the [MIT License](https://raw.githubusercontent.com/mmcdole/goxpp/master/LICENSE)
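A rough sketch of the pull-parsing flow (the XML snippet and variable names are purely illustrative):

```go
package main

import (
	"fmt"
	"strings"

	xpp "github.com/mmcdole/goxpp"
)

func main() {
	r := strings.NewReader(`<feed><title>example</title></feed>`)
	p := xpp.NewXMLPullParser(r, false, nil)

	for {
		tok, err := p.Next()
		if err != nil || tok == xpp.EndDocument {
			break
		}
		if tok == xpp.StartTag && p.Name == "title" {
			text, _ := p.NextText()
			fmt.Println(text) // prints "example"
		}
	}
}
```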
|
||||||
|
|
342
vendor/github.com/mmcdole/goxpp/xpp.go
generated
vendored
Normal file
|
@ -0,0 +1,342 @@
|
||||||
|
package xpp
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/xml"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
type XMLEventType int
|
||||||
|
type CharsetReader func(charset string, input io.Reader) (io.Reader, error)
|
||||||
|
|
||||||
|
const (
|
||||||
|
StartDocument XMLEventType = iota
|
||||||
|
EndDocument
|
||||||
|
StartTag
|
||||||
|
EndTag
|
||||||
|
Text
|
||||||
|
Comment
|
||||||
|
ProcessingInstruction
|
||||||
|
Directive
|
||||||
|
IgnorableWhitespace // TODO: ?
|
||||||
|
// TODO: CDSECT ?
|
||||||
|
)
|
||||||
|
|
||||||
|
type XMLPullParser struct {
|
||||||
|
// Document State
|
||||||
|
Spaces map[string]string
|
||||||
|
SpacesStack []map[string]string
|
||||||
|
|
||||||
|
// Token State
|
||||||
|
Depth int
|
||||||
|
Event XMLEventType
|
||||||
|
Attrs []xml.Attr
|
||||||
|
Name string
|
||||||
|
Space string
|
||||||
|
Text string
|
||||||
|
|
||||||
|
decoder *xml.Decoder
|
||||||
|
token interface{}
|
||||||
|
}
|
||||||
|
|
||||||
|
func NewXMLPullParser(r io.Reader, strict bool, cr CharsetReader) *XMLPullParser {
|
||||||
|
d := xml.NewDecoder(r)
|
||||||
|
d.Strict = strict
|
||||||
|
d.CharsetReader = cr
|
||||||
|
return &XMLPullParser{
|
||||||
|
decoder: d,
|
||||||
|
Event: StartDocument,
|
||||||
|
Depth: 0,
|
||||||
|
Spaces: map[string]string{},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) NextTag() (event XMLEventType, err error) {
|
||||||
|
t, err := p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
|
||||||
|
for t == Text && p.IsWhitespace() {
|
||||||
|
t, err = p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if t != StartTag && t != EndTag {
|
||||||
|
return event, fmt.Errorf("Expected StartTag or EndTag but got %s at offset: %d", p.EventName(t), p.decoder.InputOffset())
|
||||||
|
}
|
||||||
|
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) Next() (event XMLEventType, err error) {
|
||||||
|
for {
|
||||||
|
event, err = p.NextToken()
|
||||||
|
if err != nil {
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Return immediately after encountering a StartTag
|
||||||
|
// EndTag, Text, EndDocument
|
||||||
|
if event == StartTag ||
|
||||||
|
event == EndTag ||
|
||||||
|
event == EndDocument ||
|
||||||
|
event == Text {
|
||||||
|
return event, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Skip Comment/Directive and ProcessingInstruction
|
||||||
|
if event == Comment ||
|
||||||
|
event == Directive ||
|
||||||
|
event == ProcessingInstruction {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return event, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) NextToken() (event XMLEventType, err error) {
|
||||||
|
// Clear any state held for the previous token
|
||||||
|
p.resetTokenState()
|
||||||
|
|
||||||
|
token, err := p.decoder.Token()
|
||||||
|
if err != nil {
|
||||||
|
if err == io.EOF {
|
||||||
|
// XML decoder returns the EOF as an error
|
||||||
|
// but we want to return it as a valid
|
||||||
|
// EndDocument token instead
|
||||||
|
p.token = nil
|
||||||
|
p.Event = EndDocument
|
||||||
|
return p.Event, nil
|
||||||
|
}
|
||||||
|
return event, err
|
||||||
|
}
|
||||||
|
|
||||||
|
p.token = xml.CopyToken(token)
|
||||||
|
p.processToken(p.token)
|
||||||
|
p.Event = p.EventType(p.token)
|
||||||
|
|
||||||
|
return p.Event, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) NextText() (string, error) {
|
||||||
|
if p.Event != StartTag {
|
||||||
|
return "", errors.New("Parser must be on StartTag to get NextText()")
|
||||||
|
}
|
||||||
|
|
||||||
|
t, err := p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
|
||||||
|
if t != EndTag && t != Text {
|
||||||
|
return "", errors.New("Parser must be on EndTag or Text to read text")
|
||||||
|
}
|
||||||
|
|
||||||
|
var result string
|
||||||
|
for t == Text {
|
||||||
|
result = result + p.Text
|
||||||
|
t, err = p.Next()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
|
||||||
|
if t != EndTag && t != Text {
|
||||||
|
errstr := fmt.Sprintf("Event Text must be immediately followed by EndTag or Text but got %s", p.EventName(t))
|
||||||
|
return "", errors.New(errstr)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) Skip() error {
|
||||||
|
for {
|
||||||
|
tok, err := p.NextToken()
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if tok == StartTag {
|
||||||
|
if err := p.Skip(); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
} else if tok == EndTag {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) Attribute(name string) string {
|
||||||
|
for _, attr := range p.Attrs {
|
||||||
|
if attr.Name.Local == name {
|
||||||
|
return attr.Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) Expect(event XMLEventType, name string) (err error) {
|
||||||
|
return p.ExpectAll(event, "*", name)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) ExpectAll(event XMLEventType, space string, name string) (err error) {
|
||||||
|
if !(p.Event == event && (strings.ToLower(p.Space) == strings.ToLower(space) || space == "*") && (strings.ToLower(p.Name) == strings.ToLower(name) || name == "*")) {
|
||||||
|
err = fmt.Errorf("Expected Space:%s Name:%s Event:%s but got Space:%s Name:%s Event:%s at offset: %d", space, name, p.EventName(event), p.Space, p.Name, p.EventName(p.Event), p.decoder.InputOffset())
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) DecodeElement(v interface{}) error {
|
||||||
|
if p.Event != StartTag {
|
||||||
|
return errors.New("DecodeElement can only be called from a StartTag event")
|
||||||
|
}
|
||||||
|
|
||||||
|
//tok := &p.token
|
||||||
|
|
||||||
|
startToken := p.token.(xml.StartElement)
|
||||||
|
|
||||||
|
// Consumes all tokens until the matching end token.
|
||||||
|
err := p.decoder.DecodeElement(v, &startToken)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
name := p.Name
|
||||||
|
|
||||||
|
// Need to set the "current" token name/event
|
||||||
|
// to the previous StartTag event's name
|
||||||
|
p.resetTokenState()
|
||||||
|
p.Event = EndTag
|
||||||
|
p.Depth--
|
||||||
|
p.Name = name
|
||||||
|
p.token = nil
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) IsWhitespace() bool {
|
||||||
|
return strings.TrimSpace(p.Text) == ""
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) EventName(e XMLEventType) (name string) {
|
||||||
|
switch e {
|
||||||
|
case StartTag:
|
||||||
|
name = "StartTag"
|
||||||
|
case EndTag:
|
||||||
|
name = "EndTag"
|
||||||
|
case StartDocument:
|
||||||
|
name = "StartDocument"
|
||||||
|
case EndDocument:
|
||||||
|
name = "EndDocument"
|
||||||
|
case ProcessingInstruction:
|
||||||
|
name = "ProcessingInstruction"
|
||||||
|
case Directive:
|
||||||
|
name = "Directive"
|
||||||
|
case Comment:
|
||||||
|
name = "Comment"
|
||||||
|
case Text:
|
||||||
|
name = "Text"
|
||||||
|
case IgnorableWhitespace:
|
||||||
|
name = "IgnorableWhitespace"
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) EventType(t xml.Token) (event XMLEventType) {
|
||||||
|
switch t.(type) {
|
||||||
|
case xml.StartElement:
|
||||||
|
event = StartTag
|
||||||
|
case xml.EndElement:
|
||||||
|
event = EndTag
|
||||||
|
case xml.CharData:
|
||||||
|
event = Text
|
||||||
|
case xml.Comment:
|
||||||
|
event = Comment
|
||||||
|
case xml.ProcInst:
|
||||||
|
event = ProcessingInstruction
|
||||||
|
case xml.Directive:
|
||||||
|
event = Directive
|
||||||
|
}
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processToken(t xml.Token) {
|
||||||
|
switch tt := t.(type) {
|
||||||
|
case xml.StartElement:
|
||||||
|
p.processStartToken(tt)
|
||||||
|
case xml.EndElement:
|
||||||
|
p.processEndToken(tt)
|
||||||
|
case xml.CharData:
|
||||||
|
p.processCharDataToken(tt)
|
||||||
|
case xml.Comment:
|
||||||
|
p.processCommentToken(tt)
|
||||||
|
case xml.ProcInst:
|
||||||
|
p.processProcInstToken(tt)
|
||||||
|
case xml.Directive:
|
||||||
|
p.processDirectiveToken(tt)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processStartToken(t xml.StartElement) {
|
||||||
|
p.Depth++
|
||||||
|
p.Attrs = t.Attr
|
||||||
|
p.Name = t.Name.Local
|
||||||
|
p.Space = t.Name.Space
|
||||||
|
p.trackNamespaces(t)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processEndToken(t xml.EndElement) {
|
||||||
|
p.Depth--
|
||||||
|
p.SpacesStack = p.SpacesStack[:len(p.SpacesStack)-1]
|
||||||
|
if len(p.SpacesStack) == 0 {
|
||||||
|
p.Spaces = map[string]string{}
|
||||||
|
} else {
|
||||||
|
p.Spaces = p.SpacesStack[len(p.SpacesStack)-1]
|
||||||
|
}
|
||||||
|
p.Name = t.Name.Local
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processCharDataToken(t xml.CharData) {
|
||||||
|
p.Text = string([]byte(t))
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processCommentToken(t xml.Comment) {
|
||||||
|
p.Text = string([]byte(t))
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processProcInstToken(t xml.ProcInst) {
|
||||||
|
p.Text = fmt.Sprintf("%s %s", t.Target, string(t.Inst))
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) processDirectiveToken(t xml.Directive) {
|
||||||
|
p.Text = string([]byte(t))
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) resetTokenState() {
|
||||||
|
p.Attrs = nil
|
||||||
|
p.Name = ""
|
||||||
|
p.Space = ""
|
||||||
|
p.Text = ""
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *XMLPullParser) trackNamespaces(t xml.StartElement) {
|
||||||
|
newSpace := map[string]string{}
|
||||||
|
for k, v := range p.Spaces {
|
||||||
|
newSpace[k] = v
|
||||||
|
}
|
||||||
|
for _, attr := range t.Attr {
|
||||||
|
if attr.Name.Space == "xmlns" {
|
||||||
|
space := strings.TrimSpace(attr.Value)
|
||||||
|
spacePrefix := strings.TrimSpace(strings.ToLower(attr.Name.Local))
|
||||||
|
newSpace[space] = spacePrefix
|
||||||
|
} else if attr.Name.Local == "xmlns" {
|
||||||
|
space := strings.TrimSpace(attr.Value)
|
||||||
|
newSpace[space] = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
p.Spaces = newSpace
|
||||||
|
p.SpacesStack = append(p.SpacesStack, newSpace)
|
||||||
|
}
|
19
vendor/github.com/olekukonko/tablewriter/LICENSE.md
generated
vendored
Normal file
|
@ -0,0 +1,19 @@
|
||||||
|
Copyright (C) 2014 by Oleku Konko
|
||||||
|
|
||||||
|
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
of this software and associated documentation files (the "Software"), to deal
|
||||||
|
in the Software without restriction, including without limitation the rights
|
||||||
|
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
copies of the Software, and to permit persons to whom the Software is
|
||||||
|
furnished to do so, subject to the following conditions:
|
||||||
|
|
||||||
|
The above copyright notice and this permission notice shall be included in
|
||||||
|
all copies or substantial portions of the Software.
|
||||||
|
|
||||||
|
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||||
|
THE SOFTWARE.
|
52
vendor/github.com/olekukonko/tablewriter/csv.go
generated
vendored
Normal file
|
@ -0,0 +1,52 @@
|
||||||
|
// Copyright 2014 Oleku Konko All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// This module is a Table Writer API for the Go Programming Language.
|
||||||
|
// The protocols were written in pure Go and works on windows and unix systems
|
||||||
|
|
||||||
|
package tablewriter
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/csv"
|
||||||
|
"io"
|
||||||
|
"os"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Start A new table by importing from a CSV file
|
||||||
|
// Takes io.Writer and csv File name
|
||||||
|
func NewCSV(writer io.Writer, fileName string, hasHeader bool) (*Table, error) {
|
||||||
|
file, err := os.Open(fileName)
|
||||||
|
if err != nil {
|
||||||
|
return &Table{}, err
|
||||||
|
}
|
||||||
|
defer file.Close()
|
||||||
|
csvReader := csv.NewReader(file)
|
||||||
|
t, err := NewCSVReader(writer, csvReader, hasHeader)
|
||||||
|
return t, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Start a New Table Writer with csv.Reader
|
||||||
|
// This enables customisation such as reader.Comma = ';'
|
||||||
|
// See http://golang.org/src/pkg/encoding/csv/reader.go?s=3213:3671#L94
|
||||||
|
func NewCSVReader(writer io.Writer, csvReader *csv.Reader, hasHeader bool) (*Table, error) {
|
||||||
|
t := NewWriter(writer)
|
||||||
|
if hasHeader {
|
||||||
|
// Read the first row
|
||||||
|
headers, err := csvReader.Read()
|
||||||
|
if err != nil {
|
||||||
|
return &Table{}, err
|
||||||
|
}
|
||||||
|
t.SetHeader(headers)
|
||||||
|
}
|
||||||
|
for {
|
||||||
|
record, err := csvReader.Read()
|
||||||
|
if err == io.EOF {
|
||||||
|
break
|
||||||
|
} else if err != nil {
|
||||||
|
return &Table{}, err
|
||||||
|
}
|
||||||
|
t.Append(record)
|
||||||
|
}
|
||||||
|
return t, nil
|
||||||
|
}
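A short usage sketch for the CSV entry points (the file name and separator are made-up examples; Render belongs to the package's core table API rather than this file):

package main

import (
	"encoding/csv"
	"log"
	"os"

	"github.com/olekukonko/tablewriter"
)

func main() {
	f, err := os.Open("data.csv") // hypothetical input file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	reader := csv.NewReader(f)
	reader.Comma = ';' // the kind of customisation NewCSVReader is meant to allow

	table, err := tablewriter.NewCSVReader(os.Stdout, reader, true)
	if err != nil {
		log.Fatal(err)
	}
	table.Render()
}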
|
134
vendor/github.com/olekukonko/tablewriter/table_with_color.go
generated
vendored
Normal file
|
@ -0,0 +1,134 @@
|
||||||
|
package tablewriter
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
const ESC = "\033"
|
||||||
|
const SEP = ";"
|
||||||
|
|
||||||
|
const (
|
||||||
|
BgBlackColor int = iota + 40
|
||||||
|
BgRedColor
|
||||||
|
BgGreenColor
|
||||||
|
BgYellowColor
|
||||||
|
BgBlueColor
|
||||||
|
BgMagentaColor
|
||||||
|
BgCyanColor
|
||||||
|
BgWhiteColor
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
FgBlackColor int = iota + 30
|
||||||
|
FgRedColor
|
||||||
|
FgGreenColor
|
||||||
|
FgYellowColor
|
||||||
|
FgBlueColor
|
||||||
|
FgMagentaColor
|
||||||
|
FgCyanColor
|
||||||
|
FgWhiteColor
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
BgHiBlackColor int = iota + 100
|
||||||
|
BgHiRedColor
|
||||||
|
BgHiGreenColor
|
||||||
|
BgHiYellowColor
|
||||||
|
BgHiBlueColor
|
||||||
|
BgHiMagentaColor
|
||||||
|
BgHiCyanColor
|
||||||
|
BgHiWhiteColor
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
FgHiBlackColor int = iota + 90
|
||||||
|
FgHiRedColor
|
||||||
|
FgHiGreenColor
|
||||||
|
FgHiYellowColor
|
||||||
|
FgHiBlueColor
|
||||||
|
FgHiMagentaColor
|
||||||
|
FgHiCyanColor
|
||||||
|
FgHiWhiteColor
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
Normal = 0
|
||||||
|
Bold = 1
|
||||||
|
UnderlineSingle = 4
|
||||||
|
Italic
|
||||||
|
)
|
||||||
|
|
||||||
|
type Colors []int
|
||||||
|
|
||||||
|
func startFormat(seq string) string {
|
||||||
|
return fmt.Sprintf("%s[%sm", ESC, seq)
|
||||||
|
}
|
||||||
|
|
||||||
|
func stopFormat() string {
|
||||||
|
return fmt.Sprintf("%s[%dm", ESC, Normal)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Making the SGR (Select Graphic Rendition) sequence.
|
||||||
|
func makeSequence(codes []int) string {
|
||||||
|
codesInString := []string{}
|
||||||
|
for _, code := range codes {
|
||||||
|
codesInString = append(codesInString, strconv.Itoa(code))
|
||||||
|
}
|
||||||
|
return strings.Join(codesInString, SEP)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Adding ANSI escape sequences before and after string
|
||||||
|
func format(s string, codes interface{}) string {
|
||||||
|
var seq string
|
||||||
|
|
||||||
|
switch v := codes.(type) {
|
||||||
|
|
||||||
|
case string:
|
||||||
|
seq = v
|
||||||
|
case []int:
|
||||||
|
seq = makeSequence(v)
|
||||||
|
default:
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(seq) == 0 {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
return startFormat(seq) + s + stopFormat()
|
||||||
|
}
|
||||||
|
|
||||||
|
// Adding header colors (ANSI codes)
|
||||||
|
func (t *Table) SetHeaderColor(colors ...Colors) {
|
||||||
|
if t.colSize != len(colors) {
|
||||||
|
panic("Number of header colors must be equal to number of headers.")
|
||||||
|
}
|
||||||
|
for i := 0; i < len(colors); i++ {
|
||||||
|
t.headerParams = append(t.headerParams, makeSequence(colors[i]))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Adding column colors (ANSI codes)
|
||||||
|
func (t *Table) SetColumnColor(colors ...Colors) {
|
||||||
|
if t.colSize != len(colors) {
|
||||||
|
panic("Number of column colors must be equal to number of headers.")
|
||||||
|
}
|
||||||
|
for i := 0; i < len(colors); i++ {
|
||||||
|
t.columnsParams = append(t.columnsParams, makeSequence(colors[i]))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Adding column colors (ANSI codes)
|
||||||
|
func (t *Table) SetFooterColor(colors ...Colors) {
|
||||||
|
if len(t.footers) != len(colors) {
|
||||||
|
panic("Number of footer colors must be equal to number of footer.")
|
||||||
|
}
|
||||||
|
for i := 0; i < len(colors); i++ {
|
||||||
|
t.footerParams = append(t.footerParams, makeSequence(colors[i]))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func Color(colors ...int) []int {
|
||||||
|
return colors
|
||||||
|
}
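A brief sketch of how the color setters combine with a table (headers and data are made up; NewWriter, SetHeader, Append and Render belong to the package's core API, not this file):

package main

import (
	"os"

	"github.com/olekukonko/tablewriter"
)

func main() {
	table := tablewriter.NewWriter(os.Stdout)
	table.SetHeader([]string{"Feed", "Status"})

	// One Colors value per column; a count mismatch panics, as noted above.
	table.SetHeaderColor(
		tablewriter.Colors{tablewriter.Bold, tablewriter.FgGreenColor},
		tablewriter.Colors{tablewriter.Bold, tablewriter.FgCyanColor},
	)

	table.Append([]string{"example.org", "ok"})
	table.Render()
}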
|
99
vendor/github.com/olekukonko/tablewriter/wrap.go
generated
vendored
Normal file
|
@ -0,0 +1,99 @@
|
||||||
|
// Copyright 2014 Oleku Konko All rights reserved.
|
||||||
|
// Use of this source code is governed by a MIT
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// This module is a Table Writer API for the Go Programming Language.
|
||||||
|
// The protocols were written in pure Go and works on windows and unix systems
|
||||||
|
|
||||||
|
package tablewriter
|
||||||
|
|
||||||
|
import (
|
||||||
|
"math"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"github.com/mattn/go-runewidth"
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
nl = "\n"
|
||||||
|
sp = " "
|
||||||
|
)
|
||||||
|
|
||||||
|
const defaultPenalty = 1e5
|
||||||
|
|
||||||
|
// WrapString wraps s into a paragraph of lines of length lim, with minimal
|
||||||
|
// raggedness.
|
||||||
|
func WrapString(s string, lim int) ([]string, int) {
|
||||||
|
words := strings.Split(strings.Replace(s, nl, sp, -1), sp)
|
||||||
|
var lines []string
|
||||||
|
max := 0
|
||||||
|
for _, v := range words {
|
||||||
|
max = runewidth.StringWidth(v)
|
||||||
|
if max > lim {
|
||||||
|
lim = max
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for _, line := range WrapWords(words, 1, lim, defaultPenalty) {
|
||||||
|
lines = append(lines, strings.Join(line, sp))
|
||||||
|
}
|
||||||
|
return lines, lim
|
||||||
|
}
|
||||||
|
|
||||||
|
// WrapWords is the low-level line-breaking algorithm, useful if you need more
|
||||||
|
// control over the details of the text wrapping process. For most uses,
|
||||||
|
// WrapString will be sufficient and more convenient.
|
||||||
|
//
|
||||||
|
// WrapWords splits a list of words into lines with minimal "raggedness",
|
||||||
|
// treating each rune as one unit, accounting for spc units between adjacent
|
||||||
|
// words on each line, and attempting to limit lines to lim units. Raggedness
|
||||||
|
// is the total error over all lines, where error is the square of the
|
||||||
|
// difference of the length of the line and lim. Too-long lines (which only
|
||||||
|
// happen when a single word is longer than lim units) have pen penalty units
|
||||||
|
// added to the error.
|
||||||
|
func WrapWords(words []string, spc, lim, pen int) [][]string {
|
||||||
|
n := len(words)
|
||||||
|
|
||||||
|
length := make([][]int, n)
|
||||||
|
for i := 0; i < n; i++ {
|
||||||
|
length[i] = make([]int, n)
|
||||||
|
length[i][i] = runewidth.StringWidth(words[i])
|
||||||
|
for j := i + 1; j < n; j++ {
|
||||||
|
length[i][j] = length[i][j-1] + spc + runewidth.StringWidth(words[j])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
nbrk := make([]int, n)
|
||||||
|
cost := make([]int, n)
|
||||||
|
for i := range cost {
|
||||||
|
cost[i] = math.MaxInt32
|
||||||
|
}
|
||||||
|
for i := n - 1; i >= 0; i-- {
|
||||||
|
if length[i][n-1] <= lim {
|
||||||
|
cost[i] = 0
|
||||||
|
nbrk[i] = n
|
||||||
|
} else {
|
||||||
|
for j := i + 1; j < n; j++ {
|
||||||
|
d := lim - length[i][j-1]
|
||||||
|
c := d*d + cost[j]
|
||||||
|
if length[i][j-1] > lim {
|
||||||
|
c += pen // too-long lines get a worse penalty
|
||||||
|
}
|
||||||
|
if c < cost[i] {
|
||||||
|
cost[i] = c
|
||||||
|
nbrk[i] = j
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
var lines [][]string
|
||||||
|
i := 0
|
||||||
|
for i < n {
|
||||||
|
lines = append(lines, words[i:nbrk[i]])
|
||||||
|
i = nbrk[i]
|
||||||
|
}
|
||||||
|
return lines
|
||||||
|
}
|
||||||
|
|
||||||
|
// getLines decomposes a multiline string into a slice of strings.
|
||||||
|
func getLines(s string) []string {
|
||||||
|
return strings.Split(s, nl)
|
||||||
|
}
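A small sketch of the wrapping helpers in isolation (the input string and width limit are arbitrary):

package main

import (
	"fmt"

	"github.com/olekukonko/tablewriter"
)

func main() {
	lines, width := tablewriter.WrapString("resolving relative links in feed content", 12)
	for _, line := range lines {
		fmt.Printf("%-*s|\n", width, line)
	}
}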
|
21
vendor/github.com/ssor/bom/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,21 @@
|
||||||
|
MIT License
|
||||||
|
|
||||||
|
Copyright (c) 2017 Asher
|
||||||
|
|
||||||
|
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
of this software and associated documentation files (the "Software"), to deal
|
||||||
|
in the Software without restriction, including without limitation the rights
|
||||||
|
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
copies of the Software, and to permit persons to whom the Software is
|
||||||
|
furnished to do so, subject to the following conditions:
|
||||||
|
|
||||||
|
The above copyright notice and this permission notice shall be included in all
|
||||||
|
copies or substantial portions of the Software.
|
||||||
|
|
||||||
|
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||||
|
SOFTWARE.
|
23
vendor/github.com/ssor/bom/README.md
generated
vendored
Normal file
|
@ -0,0 +1,23 @@
|
||||||
|
# bom
|
||||||
|
Small tools for cleaning the UTF-8 BOM from a byte array or reader.
|
||||||
|
|
||||||
|
|
||||||
|
## Installation
|
||||||
|
|
||||||
|
```sh
|
||||||
|
$ go get github.com/ssor/bom
|
||||||
|
```
|
||||||
|
|
||||||
|
## How to Use
|
||||||
|
|
||||||
|
|
||||||
|
```
|
||||||
|
bs := []byte{bom0, bom1, bom2, 0x11}
|
||||||
|
result := CleanBom(bs)
|
||||||
|
```
|
||||||
|
|
||||||
|
```
|
||||||
|
bs := []byte{bom0, bom1, bom2, 0x11}
|
||||||
|
result, err := NewReaderWithoutBom(bytes.NewReader(bs))
|
||||||
|
|
||||||
|
```
|
34 vendor/github.com/ssor/bom/bom.go generated vendored Normal file
@@ -0,0 +1,34 @@
package bom

import (
	"bytes"
	"io"
	"io/ioutil"
)

const (
	bom0 = 0xef
	bom1 = 0xbb
	bom2 = 0xbf
)

// CleanBom returns b with the 3 byte BOM stripped off the front if it is present.
// If the BOM is not present, then b is returned.
func CleanBom(b []byte) []byte {
	if len(b) >= 3 &&
		b[0] == bom0 &&
		b[1] == bom1 &&
		b[2] == bom2 {
		return b[3:]
	}
	return b
}

// NewReaderWithoutBom returns an io.Reader that will skip over initial UTF-8 byte order marks.
func NewReaderWithoutBom(r io.Reader) (io.Reader, error) {
	bs, err := ioutil.ReadAll(r)
	if err != nil {
		return nil, err
	}
	return bytes.NewReader(CleanBom(bs)), nil
}
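A short usage sketch for the two exported helpers above; the import path is taken from the README in this same commit, and the sample bytes are illustrative.

```go
package main

import (
	"fmt"
	"io/ioutil"
	"strings"

	"github.com/ssor/bom"
)

func main() {
	// A UTF-8 BOM (0xef 0xbb 0xbf) followed by payload bytes.
	data := []byte("\xef\xbb\xbfhello")
	fmt.Printf("%q\n", bom.CleanBom(data)) // "hello"

	// The reader variant buffers the whole source, so its error only
	// reflects a failed read of the underlying io.Reader.
	r, err := bom.NewReaderWithoutBom(strings.NewReader("\xef\xbb\xbfworld"))
	if err != nil {
		panic(err)
	}
	rest, _ := ioutil.ReadAll(r)
	fmt.Printf("%q\n", rest) // "world"
}
```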
27 vendor/golang.org/x/net/LICENSE generated vendored Normal file
@@ -0,0 +1,27 @@
Copyright (c) 2009 The Go Authors. All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

   * Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
   * Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following disclaimer
in the documentation and/or other materials provided with the
distribution.
   * Neither the name of Google Inc. nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
22 vendor/golang.org/x/net/PATENTS generated vendored Normal file
@@ -0,0 +1,22 @@
Additional IP Rights Grant (Patents)

"This implementation" means the copyrightable works distributed by
Google as part of the Go project.

Google hereby grants to You a perpetual, worldwide, non-exclusive,
no-charge, royalty-free, irrevocable (except as stated in this section)
patent license to make, have made, use, offer to sell, sell, import,
transfer and otherwise run, modify and propagate the contents of this
implementation of Go, where such license applies only to those patent
claims, both currently owned or controlled by Google and acquired in
the future, licensable by Google that are necessarily infringed by this
implementation of Go. This grant does not include claims that would be
infringed only as a consequence of further modification of this
implementation. If you or your agent or exclusive licensee institute or
order or agree to the institution of patent litigation against any
entity (including a cross-claim or counterclaim in a lawsuit) alleging
that this implementation of Go or any code incorporated within this
implementation of Go constitutes direct or contributory patent
infringement, or inducement of patent infringement, then any patent
rights granted to you under this License for this implementation of Go
shall terminate as of the date such litigation is filed.
78 vendor/golang.org/x/net/html/atom/atom.go generated vendored Normal file
@@ -0,0 +1,78 @@
// Copyright 2012 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// Package atom provides integer codes (also known as atoms) for a fixed set of
// frequently occurring HTML strings: tag names and attribute keys such as "p"
// and "id".
//
// Sharing an atom's name between all elements with the same tag can result in
// fewer string allocations when tokenizing and parsing HTML. Integer
// comparisons are also generally faster than string comparisons.
//
// The value of an atom's particular code is not guaranteed to stay the same
// between versions of this package. Neither is any ordering guaranteed:
// whether atom.H1 < atom.H2 may also change. The codes are not guaranteed to
// be dense. The only guarantees are that e.g. looking up "div" will yield
// atom.Div, calling atom.Div.String will return "div", and atom.Div != 0.
package atom // import "golang.org/x/net/html/atom"

// Atom is an integer code for a string. The zero value maps to "".
type Atom uint32

// String returns the atom's name.
func (a Atom) String() string {
	start := uint32(a >> 8)
	n := uint32(a & 0xff)
	if start+n > uint32(len(atomText)) {
		return ""
	}
	return atomText[start : start+n]
}

func (a Atom) string() string {
	return atomText[a>>8 : a>>8+a&0xff]
}

// fnv computes the FNV hash with an arbitrary starting value h.
func fnv(h uint32, s []byte) uint32 {
	for i := range s {
		h ^= uint32(s[i])
		h *= 16777619
	}
	return h
}

func match(s string, t []byte) bool {
	for i, c := range t {
		if s[i] != c {
			return false
		}
	}
	return true
}

// Lookup returns the atom whose name is s. It returns zero if there is no
// such atom. The lookup is case sensitive.
func Lookup(s []byte) Atom {
	if len(s) == 0 || len(s) > maxAtomLen {
		return 0
	}
	h := fnv(hash0, s)
	if a := table[h&uint32(len(table)-1)]; int(a&0xff) == len(s) && match(a.string(), s) {
		return a
	}
	if a := table[(h>>16)&uint32(len(table)-1)]; int(a&0xff) == len(s) && match(a.string(), s) {
		return a
	}
	return 0
}

// String returns a string whose contents are equal to s. In that sense, it is
// equivalent to string(s) but may be more efficient.
func String(s []byte) string {
	if a := Lookup(s); a != 0 {
		return a.String()
	}
	return string(s)
}
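The guarantees stated in the package comment (looking up "div" yields atom.Div, whose String is "div", and unknown names map to the zero Atom) can be exercised with a small sketch; the import path comes from the package clause above.

```go
package main

import (
	"fmt"

	"golang.org/x/net/html/atom"
)

func main() {
	// Lookup is case sensitive and only knows the fixed set of HTML names.
	a := atom.Lookup([]byte("div"))
	fmt.Println(a == atom.Div, a.String()) // true div

	// Unknown names map to the zero Atom, whose String() is "".
	fmt.Println(atom.Lookup([]byte("not-a-tag")) == 0) // true
}
```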
783 vendor/golang.org/x/net/html/atom/table.go generated vendored Normal file
@@ -0,0 +1,783 @@
// Code generated by go generate gen.go; DO NOT EDIT.
|
||||||
|
|
||||||
|
//go:generate go run gen.go
|
||||||
|
|
||||||
|
package atom
|
||||||
|
|
||||||
|
const (
|
||||||
|
A Atom = 0x1
|
||||||
|
Abbr Atom = 0x4
|
||||||
|
Accept Atom = 0x1a06
|
||||||
|
AcceptCharset Atom = 0x1a0e
|
||||||
|
Accesskey Atom = 0x2c09
|
||||||
|
Acronym Atom = 0xaa07
|
||||||
|
Action Atom = 0x27206
|
||||||
|
Address Atom = 0x6f307
|
||||||
|
Align Atom = 0xb105
|
||||||
|
Allowfullscreen Atom = 0x2080f
|
||||||
|
Allowpaymentrequest Atom = 0xc113
|
||||||
|
Allowusermedia Atom = 0xdd0e
|
||||||
|
Alt Atom = 0xf303
|
||||||
|
Annotation Atom = 0x1c90a
|
||||||
|
AnnotationXml Atom = 0x1c90e
|
||||||
|
Applet Atom = 0x31906
|
||||||
|
Area Atom = 0x35604
|
||||||
|
Article Atom = 0x3fc07
|
||||||
|
As Atom = 0x3c02
|
||||||
|
Aside Atom = 0x10705
|
||||||
|
Async Atom = 0xff05
|
||||||
|
Audio Atom = 0x11505
|
||||||
|
Autocomplete Atom = 0x2780c
|
||||||
|
Autofocus Atom = 0x12109
|
||||||
|
Autoplay Atom = 0x13c08
|
||||||
|
B Atom = 0x101
|
||||||
|
Base Atom = 0x3b04
|
||||||
|
Basefont Atom = 0x3b08
|
||||||
|
Bdi Atom = 0xba03
|
||||||
|
Bdo Atom = 0x14b03
|
||||||
|
Bgsound Atom = 0x15e07
|
||||||
|
Big Atom = 0x17003
|
||||||
|
Blink Atom = 0x17305
|
||||||
|
Blockquote Atom = 0x1870a
|
||||||
|
Body Atom = 0x2804
|
||||||
|
Br Atom = 0x202
|
||||||
|
Button Atom = 0x19106
|
||||||
|
Canvas Atom = 0x10306
|
||||||
|
Caption Atom = 0x23107
|
||||||
|
Center Atom = 0x22006
|
||||||
|
Challenge Atom = 0x29b09
|
||||||
|
Charset Atom = 0x2107
|
||||||
|
Checked Atom = 0x47907
|
||||||
|
Cite Atom = 0x19c04
|
||||||
|
Class Atom = 0x56405
|
||||||
|
Code Atom = 0x5c504
|
||||||
|
Col Atom = 0x1ab03
|
||||||
|
Colgroup Atom = 0x1ab08
|
||||||
|
Color Atom = 0x1bf05
|
||||||
|
Cols Atom = 0x1c404
|
||||||
|
Colspan Atom = 0x1c407
|
||||||
|
Command Atom = 0x1d707
|
||||||
|
Content Atom = 0x58b07
|
||||||
|
Contenteditable Atom = 0x58b0f
|
||||||
|
Contextmenu Atom = 0x3800b
|
||||||
|
Controls Atom = 0x1de08
|
||||||
|
Coords Atom = 0x1ea06
|
||||||
|
Crossorigin Atom = 0x1fb0b
|
||||||
|
Data Atom = 0x4a504
|
||||||
|
Datalist Atom = 0x4a508
|
||||||
|
Datetime Atom = 0x2b808
|
||||||
|
Dd Atom = 0x2d702
|
||||||
|
Default Atom = 0x10a07
|
||||||
|
Defer Atom = 0x5c705
|
||||||
|
Del Atom = 0x45203
|
||||||
|
Desc Atom = 0x56104
|
||||||
|
Details Atom = 0x7207
|
||||||
|
Dfn Atom = 0x8703
|
||||||
|
Dialog Atom = 0xbb06
|
||||||
|
Dir Atom = 0x9303
|
||||||
|
Dirname Atom = 0x9307
|
||||||
|
Disabled Atom = 0x16408
|
||||||
|
Div Atom = 0x16b03
|
||||||
|
Dl Atom = 0x5e602
|
||||||
|
Download Atom = 0x46308
|
||||||
|
Draggable Atom = 0x17a09
|
||||||
|
Dropzone Atom = 0x40508
|
||||||
|
Dt Atom = 0x64b02
|
||||||
|
Em Atom = 0x6e02
|
||||||
|
Embed Atom = 0x6e05
|
||||||
|
Enctype Atom = 0x28d07
|
||||||
|
Face Atom = 0x21e04
|
||||||
|
Fieldset Atom = 0x22608
|
||||||
|
Figcaption Atom = 0x22e0a
|
||||||
|
Figure Atom = 0x24806
|
||||||
|
Font Atom = 0x3f04
|
||||||
|
Footer Atom = 0xf606
|
||||||
|
For Atom = 0x25403
|
||||||
|
ForeignObject Atom = 0x2540d
|
||||||
|
Foreignobject Atom = 0x2610d
|
||||||
|
Form Atom = 0x26e04
|
||||||
|
Formaction Atom = 0x26e0a
|
||||||
|
Formenctype Atom = 0x2890b
|
||||||
|
Formmethod Atom = 0x2a40a
|
||||||
|
Formnovalidate Atom = 0x2ae0e
|
||||||
|
Formtarget Atom = 0x2c00a
|
||||||
|
Frame Atom = 0x8b05
|
||||||
|
Frameset Atom = 0x8b08
|
||||||
|
H1 Atom = 0x15c02
|
||||||
|
H2 Atom = 0x2de02
|
||||||
|
H3 Atom = 0x30d02
|
||||||
|
H4 Atom = 0x34502
|
||||||
|
H5 Atom = 0x34f02
|
||||||
|
H6 Atom = 0x64d02
|
||||||
|
Head Atom = 0x33104
|
||||||
|
Header Atom = 0x33106
|
||||||
|
Headers Atom = 0x33107
|
||||||
|
Height Atom = 0x5206
|
||||||
|
Hgroup Atom = 0x2ca06
|
||||||
|
Hidden Atom = 0x2d506
|
||||||
|
High Atom = 0x2db04
|
||||||
|
Hr Atom = 0x15702
|
||||||
|
Href Atom = 0x2e004
|
||||||
|
Hreflang Atom = 0x2e008
|
||||||
|
Html Atom = 0x5604
|
||||||
|
HttpEquiv Atom = 0x2e80a
|
||||||
|
I Atom = 0x601
|
||||||
|
Icon Atom = 0x58a04
|
||||||
|
Id Atom = 0x10902
|
||||||
|
Iframe Atom = 0x2fc06
|
||||||
|
Image Atom = 0x30205
|
||||||
|
Img Atom = 0x30703
|
||||||
|
Input Atom = 0x44b05
|
||||||
|
Inputmode Atom = 0x44b09
|
||||||
|
Ins Atom = 0x20403
|
||||||
|
Integrity Atom = 0x23f09
|
||||||
|
Is Atom = 0x16502
|
||||||
|
Isindex Atom = 0x30f07
|
||||||
|
Ismap Atom = 0x31605
|
||||||
|
Itemid Atom = 0x38b06
|
||||||
|
Itemprop Atom = 0x19d08
|
||||||
|
Itemref Atom = 0x3cd07
|
||||||
|
Itemscope Atom = 0x67109
|
||||||
|
Itemtype Atom = 0x31f08
|
||||||
|
Kbd Atom = 0xb903
|
||||||
|
Keygen Atom = 0x3206
|
||||||
|
Keytype Atom = 0xd607
|
||||||
|
Kind Atom = 0x17704
|
||||||
|
Label Atom = 0x5905
|
||||||
|
Lang Atom = 0x2e404
|
||||||
|
Legend Atom = 0x18106
|
||||||
|
Li Atom = 0xb202
|
||||||
|
Link Atom = 0x17404
|
||||||
|
List Atom = 0x4a904
|
||||||
|
Listing Atom = 0x4a907
|
||||||
|
Loop Atom = 0x5d04
|
||||||
|
Low Atom = 0xc303
|
||||||
|
Main Atom = 0x1004
|
||||||
|
Malignmark Atom = 0xb00a
|
||||||
|
Manifest Atom = 0x6d708
|
||||||
|
Map Atom = 0x31803
|
||||||
|
Mark Atom = 0xb604
|
||||||
|
Marquee Atom = 0x32707
|
||||||
|
Math Atom = 0x32e04
|
||||||
|
Max Atom = 0x33d03
|
||||||
|
Maxlength Atom = 0x33d09
|
||||||
|
Media Atom = 0xe605
|
||||||
|
Mediagroup Atom = 0xe60a
|
||||||
|
Menu Atom = 0x38704
|
||||||
|
Menuitem Atom = 0x38708
|
||||||
|
Meta Atom = 0x4b804
|
||||||
|
Meter Atom = 0x9805
|
||||||
|
Method Atom = 0x2a806
|
||||||
|
Mglyph Atom = 0x30806
|
||||||
|
Mi Atom = 0x34702
|
||||||
|
Min Atom = 0x34703
|
||||||
|
Minlength Atom = 0x34709
|
||||||
|
Mn Atom = 0x2b102
|
||||||
|
Mo Atom = 0xa402
|
||||||
|
Ms Atom = 0x67402
|
||||||
|
Mtext Atom = 0x35105
|
||||||
|
Multiple Atom = 0x35f08
|
||||||
|
Muted Atom = 0x36705
|
||||||
|
Name Atom = 0x9604
|
||||||
|
Nav Atom = 0x1303
|
||||||
|
Nobr Atom = 0x3704
|
||||||
|
Noembed Atom = 0x6c07
|
||||||
|
Noframes Atom = 0x8908
|
||||||
|
Nomodule Atom = 0xa208
|
||||||
|
Nonce Atom = 0x1a605
|
||||||
|
Noscript Atom = 0x21608
|
||||||
|
Novalidate Atom = 0x2b20a
|
||||||
|
Object Atom = 0x26806
|
||||||
|
Ol Atom = 0x13702
|
||||||
|
Onabort Atom = 0x19507
|
||||||
|
Onafterprint Atom = 0x2360c
|
||||||
|
Onautocomplete Atom = 0x2760e
|
||||||
|
Onautocompleteerror Atom = 0x27613
|
||||||
|
Onauxclick Atom = 0x61f0a
|
||||||
|
Onbeforeprint Atom = 0x69e0d
|
||||||
|
Onbeforeunload Atom = 0x6e70e
|
||||||
|
Onblur Atom = 0x56d06
|
||||||
|
Oncancel Atom = 0x11908
|
||||||
|
Oncanplay Atom = 0x14d09
|
||||||
|
Oncanplaythrough Atom = 0x14d10
|
||||||
|
Onchange Atom = 0x41b08
|
||||||
|
Onclick Atom = 0x2f507
|
||||||
|
Onclose Atom = 0x36c07
|
||||||
|
Oncontextmenu Atom = 0x37e0d
|
||||||
|
Oncopy Atom = 0x39106
|
||||||
|
Oncuechange Atom = 0x3970b
|
||||||
|
Oncut Atom = 0x3a205
|
||||||
|
Ondblclick Atom = 0x3a70a
|
||||||
|
Ondrag Atom = 0x3b106
|
||||||
|
Ondragend Atom = 0x3b109
|
||||||
|
Ondragenter Atom = 0x3ba0b
|
||||||
|
Ondragexit Atom = 0x3c50a
|
||||||
|
Ondragleave Atom = 0x3df0b
|
||||||
|
Ondragover Atom = 0x3ea0a
|
||||||
|
Ondragstart Atom = 0x3f40b
|
||||||
|
Ondrop Atom = 0x40306
|
||||||
|
Ondurationchange Atom = 0x41310
|
||||||
|
Onemptied Atom = 0x40a09
|
||||||
|
Onended Atom = 0x42307
|
||||||
|
Onerror Atom = 0x42a07
|
||||||
|
Onfocus Atom = 0x43107
|
||||||
|
Onhashchange Atom = 0x43d0c
|
||||||
|
Oninput Atom = 0x44907
|
||||||
|
Oninvalid Atom = 0x45509
|
||||||
|
Onkeydown Atom = 0x45e09
|
||||||
|
Onkeypress Atom = 0x46b0a
|
||||||
|
Onkeyup Atom = 0x48007
|
||||||
|
Onlanguagechange Atom = 0x48d10
|
||||||
|
Onload Atom = 0x49d06
|
||||||
|
Onloadeddata Atom = 0x49d0c
|
||||||
|
Onloadedmetadata Atom = 0x4b010
|
||||||
|
Onloadend Atom = 0x4c609
|
||||||
|
Onloadstart Atom = 0x4cf0b
|
||||||
|
Onmessage Atom = 0x4da09
|
||||||
|
Onmessageerror Atom = 0x4da0e
|
||||||
|
Onmousedown Atom = 0x4e80b
|
||||||
|
Onmouseenter Atom = 0x4f30c
|
||||||
|
Onmouseleave Atom = 0x4ff0c
|
||||||
|
Onmousemove Atom = 0x50b0b
|
||||||
|
Onmouseout Atom = 0x5160a
|
||||||
|
Onmouseover Atom = 0x5230b
|
||||||
|
Onmouseup Atom = 0x52e09
|
||||||
|
Onmousewheel Atom = 0x53c0c
|
||||||
|
Onoffline Atom = 0x54809
|
||||||
|
Ononline Atom = 0x55108
|
||||||
|
Onpagehide Atom = 0x5590a
|
||||||
|
Onpageshow Atom = 0x5730a
|
||||||
|
Onpaste Atom = 0x57f07
|
||||||
|
Onpause Atom = 0x59a07
|
||||||
|
Onplay Atom = 0x5a406
|
||||||
|
Onplaying Atom = 0x5a409
|
||||||
|
Onpopstate Atom = 0x5ad0a
|
||||||
|
Onprogress Atom = 0x5b70a
|
||||||
|
Onratechange Atom = 0x5cc0c
|
||||||
|
Onrejectionhandled Atom = 0x5d812
|
||||||
|
Onreset Atom = 0x5ea07
|
||||||
|
Onresize Atom = 0x5f108
|
||||||
|
Onscroll Atom = 0x60008
|
||||||
|
Onsecuritypolicyviolation Atom = 0x60819
|
||||||
|
Onseeked Atom = 0x62908
|
||||||
|
Onseeking Atom = 0x63109
|
||||||
|
Onselect Atom = 0x63a08
|
||||||
|
Onshow Atom = 0x64406
|
||||||
|
Onsort Atom = 0x64f06
|
||||||
|
Onstalled Atom = 0x65909
|
||||||
|
Onstorage Atom = 0x66209
|
||||||
|
Onsubmit Atom = 0x66b08
|
||||||
|
Onsuspend Atom = 0x67b09
|
||||||
|
Ontimeupdate Atom = 0x400c
|
||||||
|
Ontoggle Atom = 0x68408
|
||||||
|
Onunhandledrejection Atom = 0x68c14
|
||||||
|
Onunload Atom = 0x6ab08
|
||||||
|
Onvolumechange Atom = 0x6b30e
|
||||||
|
Onwaiting Atom = 0x6c109
|
||||||
|
Onwheel Atom = 0x6ca07
|
||||||
|
Open Atom = 0x1a304
|
||||||
|
Optgroup Atom = 0x5f08
|
||||||
|
Optimum Atom = 0x6d107
|
||||||
|
Option Atom = 0x6e306
|
||||||
|
Output Atom = 0x51d06
|
||||||
|
P Atom = 0xc01
|
||||||
|
Param Atom = 0xc05
|
||||||
|
Pattern Atom = 0x6607
|
||||||
|
Picture Atom = 0x7b07
|
||||||
|
Ping Atom = 0xef04
|
||||||
|
Placeholder Atom = 0x1310b
|
||||||
|
Plaintext Atom = 0x1b209
|
||||||
|
Playsinline Atom = 0x1400b
|
||||||
|
Poster Atom = 0x2cf06
|
||||||
|
Pre Atom = 0x47003
|
||||||
|
Preload Atom = 0x48607
|
||||||
|
Progress Atom = 0x5b908
|
||||||
|
Prompt Atom = 0x53606
|
||||||
|
Public Atom = 0x58606
|
||||||
|
Q Atom = 0xcf01
|
||||||
|
Radiogroup Atom = 0x30a
|
||||||
|
Rb Atom = 0x3a02
|
||||||
|
Readonly Atom = 0x35708
|
||||||
|
Referrerpolicy Atom = 0x3d10e
|
||||||
|
Rel Atom = 0x48703
|
||||||
|
Required Atom = 0x24c08
|
||||||
|
Reversed Atom = 0x8008
|
||||||
|
Rows Atom = 0x9c04
|
||||||
|
Rowspan Atom = 0x9c07
|
||||||
|
Rp Atom = 0x23c02
|
||||||
|
Rt Atom = 0x19a02
|
||||||
|
Rtc Atom = 0x19a03
|
||||||
|
Ruby Atom = 0xfb04
|
||||||
|
S Atom = 0x2501
|
||||||
|
Samp Atom = 0x7804
|
||||||
|
Sandbox Atom = 0x12907
|
||||||
|
Scope Atom = 0x67505
|
||||||
|
Scoped Atom = 0x67506
|
||||||
|
Script Atom = 0x21806
|
||||||
|
Seamless Atom = 0x37108
|
||||||
|
Section Atom = 0x56807
|
||||||
|
Select Atom = 0x63c06
|
||||||
|
Selected Atom = 0x63c08
|
||||||
|
Shape Atom = 0x1e505
|
||||||
|
Size Atom = 0x5f504
|
||||||
|
Sizes Atom = 0x5f505
|
||||||
|
Slot Atom = 0x1ef04
|
||||||
|
Small Atom = 0x20605
|
||||||
|
Sortable Atom = 0x65108
|
||||||
|
Sorted Atom = 0x33706
|
||||||
|
Source Atom = 0x37806
|
||||||
|
Spacer Atom = 0x43706
|
||||||
|
Span Atom = 0x9f04
|
||||||
|
Spellcheck Atom = 0x4740a
|
||||||
|
Src Atom = 0x5c003
|
||||||
|
Srcdoc Atom = 0x5c006
|
||||||
|
Srclang Atom = 0x5f907
|
||||||
|
Srcset Atom = 0x6f906
|
||||||
|
Start Atom = 0x3fa05
|
||||||
|
Step Atom = 0x58304
|
||||||
|
Strike Atom = 0xd206
|
||||||
|
Strong Atom = 0x6dd06
|
||||||
|
Style Atom = 0x6ff05
|
||||||
|
Sub Atom = 0x66d03
|
||||||
|
Summary Atom = 0x70407
|
||||||
|
Sup Atom = 0x70b03
|
||||||
|
Svg Atom = 0x70e03
|
||||||
|
System Atom = 0x71106
|
||||||
|
Tabindex Atom = 0x4be08
|
||||||
|
Table Atom = 0x59505
|
||||||
|
Target Atom = 0x2c406
|
||||||
|
Tbody Atom = 0x2705
|
||||||
|
Td Atom = 0x9202
|
||||||
|
Template Atom = 0x71408
|
||||||
|
Textarea Atom = 0x35208
|
||||||
|
Tfoot Atom = 0xf505
|
||||||
|
Th Atom = 0x15602
|
||||||
|
Thead Atom = 0x33005
|
||||||
|
Time Atom = 0x4204
|
||||||
|
Title Atom = 0x11005
|
||||||
|
Tr Atom = 0xcc02
|
||||||
|
Track Atom = 0x1ba05
|
||||||
|
Translate Atom = 0x1f209
|
||||||
|
Tt Atom = 0x6802
|
||||||
|
Type Atom = 0xd904
|
||||||
|
Typemustmatch Atom = 0x2900d
|
||||||
|
U Atom = 0xb01
|
||||||
|
Ul Atom = 0xa702
|
||||||
|
Updateviacache Atom = 0x460e
|
||||||
|
Usemap Atom = 0x59e06
|
||||||
|
Value Atom = 0x1505
|
||||||
|
Var Atom = 0x16d03
|
||||||
|
Video Atom = 0x2f105
|
||||||
|
Wbr Atom = 0x57c03
|
||||||
|
Width Atom = 0x64905
|
||||||
|
Workertype Atom = 0x71c0a
|
||||||
|
Wrap Atom = 0x72604
|
||||||
|
Xmp Atom = 0x12f03
|
||||||
|
)
|
||||||
|
|
||||||
|
const hash0 = 0x81cdf10e
|
||||||
|
|
||||||
|
const maxAtomLen = 25
|
||||||
|
|
||||||
|
var table = [1 << 9]Atom{
|
||||||
|
0x1: 0xe60a, // mediagroup
|
||||||
|
0x2: 0x2e404, // lang
|
||||||
|
0x4: 0x2c09, // accesskey
|
||||||
|
0x5: 0x8b08, // frameset
|
||||||
|
0x7: 0x63a08, // onselect
|
||||||
|
0x8: 0x71106, // system
|
||||||
|
0xa: 0x64905, // width
|
||||||
|
0xc: 0x2890b, // formenctype
|
||||||
|
0xd: 0x13702, // ol
|
||||||
|
0xe: 0x3970b, // oncuechange
|
||||||
|
0x10: 0x14b03, // bdo
|
||||||
|
0x11: 0x11505, // audio
|
||||||
|
0x12: 0x17a09, // draggable
|
||||||
|
0x14: 0x2f105, // video
|
||||||
|
0x15: 0x2b102, // mn
|
||||||
|
0x16: 0x38704, // menu
|
||||||
|
0x17: 0x2cf06, // poster
|
||||||
|
0x19: 0xf606, // footer
|
||||||
|
0x1a: 0x2a806, // method
|
||||||
|
0x1b: 0x2b808, // datetime
|
||||||
|
0x1c: 0x19507, // onabort
|
||||||
|
0x1d: 0x460e, // updateviacache
|
||||||
|
0x1e: 0xff05, // async
|
||||||
|
0x1f: 0x49d06, // onload
|
||||||
|
0x21: 0x11908, // oncancel
|
||||||
|
0x22: 0x62908, // onseeked
|
||||||
|
0x23: 0x30205, // image
|
||||||
|
0x24: 0x5d812, // onrejectionhandled
|
||||||
|
0x26: 0x17404, // link
|
||||||
|
0x27: 0x51d06, // output
|
||||||
|
0x28: 0x33104, // head
|
||||||
|
0x29: 0x4ff0c, // onmouseleave
|
||||||
|
0x2a: 0x57f07, // onpaste
|
||||||
|
0x2b: 0x5a409, // onplaying
|
||||||
|
0x2c: 0x1c407, // colspan
|
||||||
|
0x2f: 0x1bf05, // color
|
||||||
|
0x30: 0x5f504, // size
|
||||||
|
0x31: 0x2e80a, // http-equiv
|
||||||
|
0x33: 0x601, // i
|
||||||
|
0x34: 0x5590a, // onpagehide
|
||||||
|
0x35: 0x68c14, // onunhandledrejection
|
||||||
|
0x37: 0x42a07, // onerror
|
||||||
|
0x3a: 0x3b08, // basefont
|
||||||
|
0x3f: 0x1303, // nav
|
||||||
|
0x40: 0x17704, // kind
|
||||||
|
0x41: 0x35708, // readonly
|
||||||
|
0x42: 0x30806, // mglyph
|
||||||
|
0x44: 0xb202, // li
|
||||||
|
0x46: 0x2d506, // hidden
|
||||||
|
0x47: 0x70e03, // svg
|
||||||
|
0x48: 0x58304, // step
|
||||||
|
0x49: 0x23f09, // integrity
|
||||||
|
0x4a: 0x58606, // public
|
||||||
|
0x4c: 0x1ab03, // col
|
||||||
|
0x4d: 0x1870a, // blockquote
|
||||||
|
0x4e: 0x34f02, // h5
|
||||||
|
0x50: 0x5b908, // progress
|
||||||
|
0x51: 0x5f505, // sizes
|
||||||
|
0x52: 0x34502, // h4
|
||||||
|
0x56: 0x33005, // thead
|
||||||
|
0x57: 0xd607, // keytype
|
||||||
|
0x58: 0x5b70a, // onprogress
|
||||||
|
0x59: 0x44b09, // inputmode
|
||||||
|
0x5a: 0x3b109, // ondragend
|
||||||
|
0x5d: 0x3a205, // oncut
|
||||||
|
0x5e: 0x43706, // spacer
|
||||||
|
0x5f: 0x1ab08, // colgroup
|
||||||
|
0x62: 0x16502, // is
|
||||||
|
0x65: 0x3c02, // as
|
||||||
|
0x66: 0x54809, // onoffline
|
||||||
|
0x67: 0x33706, // sorted
|
||||||
|
0x69: 0x48d10, // onlanguagechange
|
||||||
|
0x6c: 0x43d0c, // onhashchange
|
||||||
|
0x6d: 0x9604, // name
|
||||||
|
0x6e: 0xf505, // tfoot
|
||||||
|
0x6f: 0x56104, // desc
|
||||||
|
0x70: 0x33d03, // max
|
||||||
|
0x72: 0x1ea06, // coords
|
||||||
|
0x73: 0x30d02, // h3
|
||||||
|
0x74: 0x6e70e, // onbeforeunload
|
||||||
|
0x75: 0x9c04, // rows
|
||||||
|
0x76: 0x63c06, // select
|
||||||
|
0x77: 0x9805, // meter
|
||||||
|
0x78: 0x38b06, // itemid
|
||||||
|
0x79: 0x53c0c, // onmousewheel
|
||||||
|
0x7a: 0x5c006, // srcdoc
|
||||||
|
0x7d: 0x1ba05, // track
|
||||||
|
0x7f: 0x31f08, // itemtype
|
||||||
|
0x82: 0xa402, // mo
|
||||||
|
0x83: 0x41b08, // onchange
|
||||||
|
0x84: 0x33107, // headers
|
||||||
|
0x85: 0x5cc0c, // onratechange
|
||||||
|
0x86: 0x60819, // onsecuritypolicyviolation
|
||||||
|
0x88: 0x4a508, // datalist
|
||||||
|
0x89: 0x4e80b, // onmousedown
|
||||||
|
0x8a: 0x1ef04, // slot
|
||||||
|
0x8b: 0x4b010, // onloadedmetadata
|
||||||
|
0x8c: 0x1a06, // accept
|
||||||
|
0x8d: 0x26806, // object
|
||||||
|
0x91: 0x6b30e, // onvolumechange
|
||||||
|
0x92: 0x2107, // charset
|
||||||
|
0x93: 0x27613, // onautocompleteerror
|
||||||
|
0x94: 0xc113, // allowpaymentrequest
|
||||||
|
0x95: 0x2804, // body
|
||||||
|
0x96: 0x10a07, // default
|
||||||
|
0x97: 0x63c08, // selected
|
||||||
|
0x98: 0x21e04, // face
|
||||||
|
0x99: 0x1e505, // shape
|
||||||
|
0x9b: 0x68408, // ontoggle
|
||||||
|
0x9e: 0x64b02, // dt
|
||||||
|
0x9f: 0xb604, // mark
|
||||||
|
0xa1: 0xb01, // u
|
||||||
|
0xa4: 0x6ab08, // onunload
|
||||||
|
0xa5: 0x5d04, // loop
|
||||||
|
0xa6: 0x16408, // disabled
|
||||||
|
0xaa: 0x42307, // onended
|
||||||
|
0xab: 0xb00a, // malignmark
|
||||||
|
0xad: 0x67b09, // onsuspend
|
||||||
|
0xae: 0x35105, // mtext
|
||||||
|
0xaf: 0x64f06, // onsort
|
||||||
|
0xb0: 0x19d08, // itemprop
|
||||||
|
0xb3: 0x67109, // itemscope
|
||||||
|
0xb4: 0x17305, // blink
|
||||||
|
0xb6: 0x3b106, // ondrag
|
||||||
|
0xb7: 0xa702, // ul
|
||||||
|
0xb8: 0x26e04, // form
|
||||||
|
0xb9: 0x12907, // sandbox
|
||||||
|
0xba: 0x8b05, // frame
|
||||||
|
0xbb: 0x1505, // value
|
||||||
|
0xbc: 0x66209, // onstorage
|
||||||
|
0xbf: 0xaa07, // acronym
|
||||||
|
0xc0: 0x19a02, // rt
|
||||||
|
0xc2: 0x202, // br
|
||||||
|
0xc3: 0x22608, // fieldset
|
||||||
|
0xc4: 0x2900d, // typemustmatch
|
||||||
|
0xc5: 0xa208, // nomodule
|
||||||
|
0xc6: 0x6c07, // noembed
|
||||||
|
0xc7: 0x69e0d, // onbeforeprint
|
||||||
|
0xc8: 0x19106, // button
|
||||||
|
0xc9: 0x2f507, // onclick
|
||||||
|
0xca: 0x70407, // summary
|
||||||
|
0xcd: 0xfb04, // ruby
|
||||||
|
0xce: 0x56405, // class
|
||||||
|
0xcf: 0x3f40b, // ondragstart
|
||||||
|
0xd0: 0x23107, // caption
|
||||||
|
0xd4: 0xdd0e, // allowusermedia
|
||||||
|
0xd5: 0x4cf0b, // onloadstart
|
||||||
|
0xd9: 0x16b03, // div
|
||||||
|
0xda: 0x4a904, // list
|
||||||
|
0xdb: 0x32e04, // math
|
||||||
|
0xdc: 0x44b05, // input
|
||||||
|
0xdf: 0x3ea0a, // ondragover
|
||||||
|
0xe0: 0x2de02, // h2
|
||||||
|
0xe2: 0x1b209, // plaintext
|
||||||
|
0xe4: 0x4f30c, // onmouseenter
|
||||||
|
0xe7: 0x47907, // checked
|
||||||
|
0xe8: 0x47003, // pre
|
||||||
|
0xea: 0x35f08, // multiple
|
||||||
|
0xeb: 0xba03, // bdi
|
||||||
|
0xec: 0x33d09, // maxlength
|
||||||
|
0xed: 0xcf01, // q
|
||||||
|
0xee: 0x61f0a, // onauxclick
|
||||||
|
0xf0: 0x57c03, // wbr
|
||||||
|
0xf2: 0x3b04, // base
|
||||||
|
0xf3: 0x6e306, // option
|
||||||
|
0xf5: 0x41310, // ondurationchange
|
||||||
|
0xf7: 0x8908, // noframes
|
||||||
|
0xf9: 0x40508, // dropzone
|
||||||
|
0xfb: 0x67505, // scope
|
||||||
|
0xfc: 0x8008, // reversed
|
||||||
|
0xfd: 0x3ba0b, // ondragenter
|
||||||
|
0xfe: 0x3fa05, // start
|
||||||
|
0xff: 0x12f03, // xmp
|
||||||
|
0x100: 0x5f907, // srclang
|
||||||
|
0x101: 0x30703, // img
|
||||||
|
0x104: 0x101, // b
|
||||||
|
0x105: 0x25403, // for
|
||||||
|
0x106: 0x10705, // aside
|
||||||
|
0x107: 0x44907, // oninput
|
||||||
|
0x108: 0x35604, // area
|
||||||
|
0x109: 0x2a40a, // formmethod
|
||||||
|
0x10a: 0x72604, // wrap
|
||||||
|
0x10c: 0x23c02, // rp
|
||||||
|
0x10d: 0x46b0a, // onkeypress
|
||||||
|
0x10e: 0x6802, // tt
|
||||||
|
0x110: 0x34702, // mi
|
||||||
|
0x111: 0x36705, // muted
|
||||||
|
0x112: 0xf303, // alt
|
||||||
|
0x113: 0x5c504, // code
|
||||||
|
0x114: 0x6e02, // em
|
||||||
|
0x115: 0x3c50a, // ondragexit
|
||||||
|
0x117: 0x9f04, // span
|
||||||
|
0x119: 0x6d708, // manifest
|
||||||
|
0x11a: 0x38708, // menuitem
|
||||||
|
0x11b: 0x58b07, // content
|
||||||
|
0x11d: 0x6c109, // onwaiting
|
||||||
|
0x11f: 0x4c609, // onloadend
|
||||||
|
0x121: 0x37e0d, // oncontextmenu
|
||||||
|
0x123: 0x56d06, // onblur
|
||||||
|
0x124: 0x3fc07, // article
|
||||||
|
0x125: 0x9303, // dir
|
||||||
|
0x126: 0xef04, // ping
|
||||||
|
0x127: 0x24c08, // required
|
||||||
|
0x128: 0x45509, // oninvalid
|
||||||
|
0x129: 0xb105, // align
|
||||||
|
0x12b: 0x58a04, // icon
|
||||||
|
0x12c: 0x64d02, // h6
|
||||||
|
0x12d: 0x1c404, // cols
|
||||||
|
0x12e: 0x22e0a, // figcaption
|
||||||
|
0x12f: 0x45e09, // onkeydown
|
||||||
|
0x130: 0x66b08, // onsubmit
|
||||||
|
0x131: 0x14d09, // oncanplay
|
||||||
|
0x132: 0x70b03, // sup
|
||||||
|
0x133: 0xc01, // p
|
||||||
|
0x135: 0x40a09, // onemptied
|
||||||
|
0x136: 0x39106, // oncopy
|
||||||
|
0x137: 0x19c04, // cite
|
||||||
|
0x138: 0x3a70a, // ondblclick
|
||||||
|
0x13a: 0x50b0b, // onmousemove
|
||||||
|
0x13c: 0x66d03, // sub
|
||||||
|
0x13d: 0x48703, // rel
|
||||||
|
0x13e: 0x5f08, // optgroup
|
||||||
|
0x142: 0x9c07, // rowspan
|
||||||
|
0x143: 0x37806, // source
|
||||||
|
0x144: 0x21608, // noscript
|
||||||
|
0x145: 0x1a304, // open
|
||||||
|
0x146: 0x20403, // ins
|
||||||
|
0x147: 0x2540d, // foreignObject
|
||||||
|
0x148: 0x5ad0a, // onpopstate
|
||||||
|
0x14a: 0x28d07, // enctype
|
||||||
|
0x14b: 0x2760e, // onautocomplete
|
||||||
|
0x14c: 0x35208, // textarea
|
||||||
|
0x14e: 0x2780c, // autocomplete
|
||||||
|
0x14f: 0x15702, // hr
|
||||||
|
0x150: 0x1de08, // controls
|
||||||
|
0x151: 0x10902, // id
|
||||||
|
0x153: 0x2360c, // onafterprint
|
||||||
|
0x155: 0x2610d, // foreignobject
|
||||||
|
0x156: 0x32707, // marquee
|
||||||
|
0x157: 0x59a07, // onpause
|
||||||
|
0x158: 0x5e602, // dl
|
||||||
|
0x159: 0x5206, // height
|
||||||
|
0x15a: 0x34703, // min
|
||||||
|
0x15b: 0x9307, // dirname
|
||||||
|
0x15c: 0x1f209, // translate
|
||||||
|
0x15d: 0x5604, // html
|
||||||
|
0x15e: 0x34709, // minlength
|
||||||
|
0x15f: 0x48607, // preload
|
||||||
|
0x160: 0x71408, // template
|
||||||
|
0x161: 0x3df0b, // ondragleave
|
||||||
|
0x162: 0x3a02, // rb
|
||||||
|
0x164: 0x5c003, // src
|
||||||
|
0x165: 0x6dd06, // strong
|
||||||
|
0x167: 0x7804, // samp
|
||||||
|
0x168: 0x6f307, // address
|
||||||
|
0x169: 0x55108, // ononline
|
||||||
|
0x16b: 0x1310b, // placeholder
|
||||||
|
0x16c: 0x2c406, // target
|
||||||
|
0x16d: 0x20605, // small
|
||||||
|
0x16e: 0x6ca07, // onwheel
|
||||||
|
0x16f: 0x1c90a, // annotation
|
||||||
|
0x170: 0x4740a, // spellcheck
|
||||||
|
0x171: 0x7207, // details
|
||||||
|
0x172: 0x10306, // canvas
|
||||||
|
0x173: 0x12109, // autofocus
|
||||||
|
0x174: 0xc05, // param
|
||||||
|
0x176: 0x46308, // download
|
||||||
|
0x177: 0x45203, // del
|
||||||
|
0x178: 0x36c07, // onclose
|
||||||
|
0x179: 0xb903, // kbd
|
||||||
|
0x17a: 0x31906, // applet
|
||||||
|
0x17b: 0x2e004, // href
|
||||||
|
0x17c: 0x5f108, // onresize
|
||||||
|
0x17e: 0x49d0c, // onloadeddata
|
||||||
|
0x180: 0xcc02, // tr
|
||||||
|
0x181: 0x2c00a, // formtarget
|
||||||
|
0x182: 0x11005, // title
|
||||||
|
0x183: 0x6ff05, // style
|
||||||
|
0x184: 0xd206, // strike
|
||||||
|
0x185: 0x59e06, // usemap
|
||||||
|
0x186: 0x2fc06, // iframe
|
||||||
|
0x187: 0x1004, // main
|
||||||
|
0x189: 0x7b07, // picture
|
||||||
|
0x18c: 0x31605, // ismap
|
||||||
|
0x18e: 0x4a504, // data
|
||||||
|
0x18f: 0x5905, // label
|
||||||
|
0x191: 0x3d10e, // referrerpolicy
|
||||||
|
0x192: 0x15602, // th
|
||||||
|
0x194: 0x53606, // prompt
|
||||||
|
0x195: 0x56807, // section
|
||||||
|
0x197: 0x6d107, // optimum
|
||||||
|
0x198: 0x2db04, // high
|
||||||
|
0x199: 0x15c02, // h1
|
||||||
|
0x19a: 0x65909, // onstalled
|
||||||
|
0x19b: 0x16d03, // var
|
||||||
|
0x19c: 0x4204, // time
|
||||||
|
0x19e: 0x67402, // ms
|
||||||
|
0x19f: 0x33106, // header
|
||||||
|
0x1a0: 0x4da09, // onmessage
|
||||||
|
0x1a1: 0x1a605, // nonce
|
||||||
|
0x1a2: 0x26e0a, // formaction
|
||||||
|
0x1a3: 0x22006, // center
|
||||||
|
0x1a4: 0x3704, // nobr
|
||||||
|
0x1a5: 0x59505, // table
|
||||||
|
0x1a6: 0x4a907, // listing
|
||||||
|
0x1a7: 0x18106, // legend
|
||||||
|
0x1a9: 0x29b09, // challenge
|
||||||
|
0x1aa: 0x24806, // figure
|
||||||
|
0x1ab: 0xe605, // media
|
||||||
|
0x1ae: 0xd904, // type
|
||||||
|
0x1af: 0x3f04, // font
|
||||||
|
0x1b0: 0x4da0e, // onmessageerror
|
||||||
|
0x1b1: 0x37108, // seamless
|
||||||
|
0x1b2: 0x8703, // dfn
|
||||||
|
0x1b3: 0x5c705, // defer
|
||||||
|
0x1b4: 0xc303, // low
|
||||||
|
0x1b5: 0x19a03, // rtc
|
||||||
|
0x1b6: 0x5230b, // onmouseover
|
||||||
|
0x1b7: 0x2b20a, // novalidate
|
||||||
|
0x1b8: 0x71c0a, // workertype
|
||||||
|
0x1ba: 0x3cd07, // itemref
|
||||||
|
0x1bd: 0x1, // a
|
||||||
|
0x1be: 0x31803, // map
|
||||||
|
0x1bf: 0x400c, // ontimeupdate
|
||||||
|
0x1c0: 0x15e07, // bgsound
|
||||||
|
0x1c1: 0x3206, // keygen
|
||||||
|
0x1c2: 0x2705, // tbody
|
||||||
|
0x1c5: 0x64406, // onshow
|
||||||
|
0x1c7: 0x2501, // s
|
||||||
|
0x1c8: 0x6607, // pattern
|
||||||
|
0x1cc: 0x14d10, // oncanplaythrough
|
||||||
|
0x1ce: 0x2d702, // dd
|
||||||
|
0x1cf: 0x6f906, // srcset
|
||||||
|
0x1d0: 0x17003, // big
|
||||||
|
0x1d2: 0x65108, // sortable
|
||||||
|
0x1d3: 0x48007, // onkeyup
|
||||||
|
0x1d5: 0x5a406, // onplay
|
||||||
|
0x1d7: 0x4b804, // meta
|
||||||
|
0x1d8: 0x40306, // ondrop
|
||||||
|
0x1da: 0x60008, // onscroll
|
||||||
|
0x1db: 0x1fb0b, // crossorigin
|
||||||
|
0x1dc: 0x5730a, // onpageshow
|
||||||
|
0x1dd: 0x4, // abbr
|
||||||
|
0x1de: 0x9202, // td
|
||||||
|
0x1df: 0x58b0f, // contenteditable
|
||||||
|
0x1e0: 0x27206, // action
|
||||||
|
0x1e1: 0x1400b, // playsinline
|
||||||
|
0x1e2: 0x43107, // onfocus
|
||||||
|
0x1e3: 0x2e008, // hreflang
|
||||||
|
0x1e5: 0x5160a, // onmouseout
|
||||||
|
0x1e6: 0x5ea07, // onreset
|
||||||
|
0x1e7: 0x13c08, // autoplay
|
||||||
|
0x1e8: 0x63109, // onseeking
|
||||||
|
0x1ea: 0x67506, // scoped
|
||||||
|
0x1ec: 0x30a, // radiogroup
|
||||||
|
0x1ee: 0x3800b, // contextmenu
|
||||||
|
0x1ef: 0x52e09, // onmouseup
|
||||||
|
0x1f1: 0x2ca06, // hgroup
|
||||||
|
0x1f2: 0x2080f, // allowfullscreen
|
||||||
|
0x1f3: 0x4be08, // tabindex
|
||||||
|
0x1f6: 0x30f07, // isindex
|
||||||
|
0x1f7: 0x1a0e, // accept-charset
|
||||||
|
0x1f8: 0x2ae0e, // formnovalidate
|
||||||
|
0x1fb: 0x1c90e, // annotation-xml
|
||||||
|
0x1fc: 0x6e05, // embed
|
||||||
|
0x1fd: 0x21806, // script
|
||||||
|
0x1fe: 0xbb06, // dialog
|
||||||
|
0x1ff: 0x1d707, // command
|
||||||
|
}
|
||||||
|
|
||||||
|
const atomText = "abbradiogrouparamainavalueaccept-charsetbodyaccesskeygenobrb" +
|
||||||
|
"asefontimeupdateviacacheightmlabelooptgroupatternoembedetail" +
|
||||||
|
"sampictureversedfnoframesetdirnameterowspanomoduleacronymali" +
|
||||||
|
"gnmarkbdialogallowpaymentrequestrikeytypeallowusermediagroup" +
|
||||||
|
"ingaltfooterubyasyncanvasidefaultitleaudioncancelautofocusan" +
|
||||||
|
"dboxmplaceholderautoplaysinlinebdoncanplaythrough1bgsoundisa" +
|
||||||
|
"bledivarbigblinkindraggablegendblockquotebuttonabortcitempro" +
|
||||||
|
"penoncecolgrouplaintextrackcolorcolspannotation-xmlcommandco" +
|
||||||
|
"ntrolshapecoordslotranslatecrossoriginsmallowfullscreenoscri" +
|
||||||
|
"ptfacenterfieldsetfigcaptionafterprintegrityfigurequiredfore" +
|
||||||
|
"ignObjectforeignobjectformactionautocompleteerrorformenctype" +
|
||||||
|
"mustmatchallengeformmethodformnovalidatetimeformtargethgroup" +
|
||||||
|
"osterhiddenhigh2hreflanghttp-equivideonclickiframeimageimgly" +
|
||||||
|
"ph3isindexismappletitemtypemarqueematheadersortedmaxlength4m" +
|
||||||
|
"inlength5mtextareadonlymultiplemutedoncloseamlessourceoncont" +
|
||||||
|
"extmenuitemidoncopyoncuechangeoncutondblclickondragendondrag" +
|
||||||
|
"enterondragexitemreferrerpolicyondragleaveondragoverondragst" +
|
||||||
|
"articleondropzonemptiedondurationchangeonendedonerroronfocus" +
|
||||||
|
"paceronhashchangeoninputmodeloninvalidonkeydownloadonkeypres" +
|
||||||
|
"spellcheckedonkeyupreloadonlanguagechangeonloadeddatalisting" +
|
||||||
|
"onloadedmetadatabindexonloadendonloadstartonmessageerroronmo" +
|
||||||
|
"usedownonmouseenteronmouseleaveonmousemoveonmouseoutputonmou" +
|
||||||
|
"seoveronmouseupromptonmousewheelonofflineononlineonpagehides" +
|
||||||
|
"classectionbluronpageshowbronpastepublicontenteditableonpaus" +
|
||||||
|
"emaponplayingonpopstateonprogressrcdocodeferonratechangeonre" +
|
||||||
|
"jectionhandledonresetonresizesrclangonscrollonsecuritypolicy" +
|
||||||
|
"violationauxclickonseekedonseekingonselectedonshowidth6onsor" +
|
||||||
|
"tableonstalledonstorageonsubmitemscopedonsuspendontoggleonun" +
|
||||||
|
"handledrejectionbeforeprintonunloadonvolumechangeonwaitingon" +
|
||||||
|
"wheeloptimumanifestrongoptionbeforeunloaddressrcsetstylesumm" +
|
||||||
|
"arysupsvgsystemplateworkertypewrap"
|
112 vendor/golang.org/x/net/html/const.go generated vendored Normal file
@@ -0,0 +1,112 @@
// Copyright 2011 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package html

// Section 12.2.4.2 of the HTML5 specification says "The following elements
// have varying levels of special parsing rules".
// https://html.spec.whatwg.org/multipage/syntax.html#the-stack-of-open-elements
var isSpecialElementMap = map[string]bool{
	"address":    true,
	"applet":     true,
	"area":       true,
	"article":    true,
	"aside":      true,
	"base":       true,
	"basefont":   true,
	"bgsound":    true,
	"blockquote": true,
	"body":       true,
	"br":         true,
	"button":     true,
	"caption":    true,
	"center":     true,
	"col":        true,
	"colgroup":   true,
	"dd":         true,
	"details":    true,
	"dir":        true,
	"div":        true,
	"dl":         true,
	"dt":         true,
	"embed":      true,
	"fieldset":   true,
	"figcaption": true,
	"figure":     true,
	"footer":     true,
	"form":       true,
	"frame":      true,
	"frameset":   true,
	"h1":         true,
	"h2":         true,
	"h3":         true,
	"h4":         true,
	"h5":         true,
	"h6":         true,
	"head":       true,
	"header":     true,
	"hgroup":     true,
	"hr":         true,
	"html":       true,
	"iframe":     true,
	"img":        true,
	"input":      true,
	"isindex":    true, // The 'isindex' element has been removed, but keep it for backwards compatibility.
	"keygen":     true,
	"li":         true,
	"link":       true,
	"listing":    true,
	"main":       true,
	"marquee":    true,
	"menu":       true,
	"meta":       true,
	"nav":        true,
	"noembed":    true,
	"noframes":   true,
	"noscript":   true,
	"object":     true,
	"ol":         true,
	"p":          true,
	"param":      true,
	"plaintext":  true,
	"pre":        true,
	"script":     true,
	"section":    true,
	"select":     true,
	"source":     true,
	"style":      true,
	"summary":    true,
	"table":      true,
	"tbody":      true,
	"td":         true,
	"template":   true,
	"textarea":   true,
	"tfoot":      true,
	"th":         true,
	"thead":      true,
	"title":      true,
	"tr":         true,
	"track":      true,
	"ul":         true,
	"wbr":        true,
	"xmp":        true,
}

func isSpecialElement(element *Node) bool {
	switch element.Namespace {
	case "", "html":
		return isSpecialElementMap[element.Data]
	case "math":
		switch element.Data {
		case "mi", "mo", "mn", "ms", "mtext", "annotation-xml":
			return true
		}
	case "svg":
		switch element.Data {
		case "foreignObject", "desc", "title":
			return true
		}
	}
	return false
}
106 vendor/golang.org/x/net/html/doc.go generated vendored Normal file
@@ -0,0 +1,106 @@
// Copyright 2010 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

/*
Package html implements an HTML5-compliant tokenizer and parser.

Tokenization is done by creating a Tokenizer for an io.Reader r. It is the
caller's responsibility to ensure that r provides UTF-8 encoded HTML.

	z := html.NewTokenizer(r)

Given a Tokenizer z, the HTML is tokenized by repeatedly calling z.Next(),
which parses the next token and returns its type, or an error:

	for {
		tt := z.Next()
		if tt == html.ErrorToken {
			// ...
			return ...
		}
		// Process the current token.
	}

There are two APIs for retrieving the current token. The high-level API is to
call Token; the low-level API is to call Text or TagName / TagAttr. Both APIs
allow optionally calling Raw after Next but before Token, Text, TagName, or
TagAttr. In EBNF notation, the valid call sequence per token is:

	Next {Raw} [ Token | Text | TagName {TagAttr} ]

Token returns an independent data structure that completely describes a token.
Entities (such as "&lt;") are unescaped, tag names and attribute keys are
lower-cased, and attributes are collected into a []Attribute. For example:

	for {
		if z.Next() == html.ErrorToken {
			// Returning io.EOF indicates success.
			return z.Err()
		}
		emitToken(z.Token())
	}

The low-level API performs fewer allocations and copies, but the contents of
the []byte values returned by Text, TagName and TagAttr may change on the next
call to Next. For example, to extract an HTML page's anchor text:

	depth := 0
	for {
		tt := z.Next()
		switch tt {
		case html.ErrorToken:
			return z.Err()
		case html.TextToken:
			if depth > 0 {
				// emitBytes should copy the []byte it receives,
				// if it doesn't process it immediately.
				emitBytes(z.Text())
			}
		case html.StartTagToken, html.EndTagToken:
			tn, _ := z.TagName()
			if len(tn) == 1 && tn[0] == 'a' {
				if tt == html.StartTagToken {
					depth++
				} else {
					depth--
				}
			}
		}
	}

Parsing is done by calling Parse with an io.Reader, which returns the root of
the parse tree (the document element) as a *Node. It is the caller's
responsibility to ensure that the Reader provides UTF-8 encoded HTML. For
example, to process each anchor node in depth-first order:

	doc, err := html.Parse(r)
	if err != nil {
		// ...
	}
	var f func(*html.Node)
	f = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			// Do something with n...
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			f(c)
		}
	}
	f(doc)

The relevant specifications include:
https://html.spec.whatwg.org/multipage/syntax.html and
https://html.spec.whatwg.org/multipage/syntax.html#tokenization
*/
package html // import "golang.org/x/net/html"

// The tokenization algorithm implemented by this package is not a line-by-line
// transliteration of the relatively verbose state-machine in the WHATWG
// specification. A more direct approach is used instead, where the program
// counter implies the state, such as whether it is tokenizing a tag or a text
// node. Specification compliance is verified by checking expected and actual
// outputs over a test suite rather than aiming for algorithmic fidelity.

// TODO(nigeltao): Does a DOM API belong in this package or a separate one?
// TODO(nigeltao): How does parsing interact with a JavaScript engine?
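The parsing half of the package comment translates directly into a small, self-contained program; the sample markup and the href-collecting walk are illustrative, not part of the vendored code.

```go
package main

import (
	"fmt"
	"strings"

	"golang.org/x/net/html"
)

func main() {
	const page = `<p>See <a href="https://example.org">example</a> and <a href="https://golang.org">Go</a>.</p>`

	doc, err := html.Parse(strings.NewReader(page))
	if err != nil {
		panic(err)
	}

	// Depth-first walk over the parse tree, printing every anchor's href.
	var walk func(*html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, attr := range n.Attr {
				if attr.Key == "href" {
					fmt.Println(attr.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(doc)
}
```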
156 vendor/golang.org/x/net/html/doctype.go generated vendored Normal file
@@ -0,0 +1,156 @@
// Copyright 2011 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package html
|
||||||
|
|
||||||
|
import (
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
// parseDoctype parses the data from a DoctypeToken into a name,
|
||||||
|
// public identifier, and system identifier. It returns a Node whose Type
|
||||||
|
// is DoctypeNode, whose Data is the name, and which has attributes
|
||||||
|
// named "system" and "public" for the two identifiers if they were present.
|
||||||
|
// quirks is whether the document should be parsed in "quirks mode".
|
||||||
|
func parseDoctype(s string) (n *Node, quirks bool) {
|
||||||
|
n = &Node{Type: DoctypeNode}
|
||||||
|
|
||||||
|
// Find the name.
|
||||||
|
space := strings.IndexAny(s, whitespace)
|
||||||
|
if space == -1 {
|
||||||
|
space = len(s)
|
||||||
|
}
|
||||||
|
n.Data = s[:space]
|
||||||
|
// The comparison to "html" is case-sensitive.
|
||||||
|
if n.Data != "html" {
|
||||||
|
quirks = true
|
||||||
|
}
|
||||||
|
n.Data = strings.ToLower(n.Data)
|
||||||
|
s = strings.TrimLeft(s[space:], whitespace)
|
||||||
|
|
||||||
|
if len(s) < 6 {
|
||||||
|
// It can't start with "PUBLIC" or "SYSTEM".
|
||||||
|
// Ignore the rest of the string.
|
||||||
|
return n, quirks || s != ""
|
||||||
|
}
|
||||||
|
|
||||||
|
key := strings.ToLower(s[:6])
|
||||||
|
s = s[6:]
|
||||||
|
for key == "public" || key == "system" {
|
||||||
|
s = strings.TrimLeft(s, whitespace)
|
||||||
|
if s == "" {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
quote := s[0]
|
||||||
|
if quote != '"' && quote != '\'' {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
s = s[1:]
|
||||||
|
q := strings.IndexRune(s, rune(quote))
|
||||||
|
var id string
|
||||||
|
if q == -1 {
|
||||||
|
id = s
|
||||||
|
s = ""
|
||||||
|
} else {
|
||||||
|
id = s[:q]
|
||||||
|
s = s[q+1:]
|
||||||
|
}
|
||||||
|
n.Attr = append(n.Attr, Attribute{Key: key, Val: id})
|
||||||
|
if key == "public" {
|
||||||
|
key = "system"
|
||||||
|
} else {
|
||||||
|
key = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if key != "" || s != "" {
|
||||||
|
quirks = true
|
||||||
|
} else if len(n.Attr) > 0 {
|
||||||
|
if n.Attr[0].Key == "public" {
|
||||||
|
public := strings.ToLower(n.Attr[0].Val)
|
||||||
|
switch public {
|
||||||
|
case "-//w3o//dtd w3 html strict 3.0//en//", "-/w3d/dtd html 4.0 transitional/en", "html":
|
||||||
|
quirks = true
|
||||||
|
default:
|
||||||
|
for _, q := range quirkyIDs {
|
||||||
|
if strings.HasPrefix(public, q) {
|
||||||
|
quirks = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// The following two public IDs only cause quirks mode if there is no system ID.
|
||||||
|
if len(n.Attr) == 1 && (strings.HasPrefix(public, "-//w3c//dtd html 4.01 frameset//") ||
|
||||||
|
strings.HasPrefix(public, "-//w3c//dtd html 4.01 transitional//")) {
|
||||||
|
quirks = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if lastAttr := n.Attr[len(n.Attr)-1]; lastAttr.Key == "system" &&
|
||||||
|
strings.ToLower(lastAttr.Val) == "http://www.ibm.com/data/dtd/v11/ibmxhtml1-transitional.dtd" {
|
||||||
|
quirks = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return n, quirks
|
||||||
|
}
|
||||||
|
|
||||||
|
// quirkyIDs is a list of public doctype identifiers that cause a document
|
||||||
|
// to be interpreted in quirks mode. The identifiers should be in lower case.
|
||||||
|
var quirkyIDs = []string{
|
||||||
|
"+//silmaril//dtd html pro v0r11 19970101//",
|
||||||
|
"-//advasoft ltd//dtd html 3.0 aswedit + extensions//",
|
||||||
|
"-//as//dtd html 3.0 aswedit + extensions//",
|
||||||
|
"-//ietf//dtd html 2.0 level 1//",
|
||||||
|
"-//ietf//dtd html 2.0 level 2//",
|
||||||
|
"-//ietf//dtd html 2.0 strict level 1//",
|
||||||
|
"-//ietf//dtd html 2.0 strict level 2//",
|
||||||
|
"-//ietf//dtd html 2.0 strict//",
|
||||||
|
"-//ietf//dtd html 2.0//",
|
||||||
|
"-//ietf//dtd html 2.1e//",
|
||||||
|
"-//ietf//dtd html 3.0//",
|
||||||
|
"-//ietf//dtd html 3.2 final//",
|
||||||
|
"-//ietf//dtd html 3.2//",
|
||||||
|
"-//ietf//dtd html 3//",
|
||||||
|
"-//ietf//dtd html level 0//",
|
||||||
|
"-//ietf//dtd html level 1//",
|
||||||
|
"-//ietf//dtd html level 2//",
|
||||||
|
"-//ietf//dtd html level 3//",
|
||||||
|
"-//ietf//dtd html strict level 0//",
|
||||||
|
"-//ietf//dtd html strict level 1//",
|
||||||
|
"-//ietf//dtd html strict level 2//",
|
||||||
|
"-//ietf//dtd html strict level 3//",
|
||||||
|
"-//ietf//dtd html strict//",
|
||||||
|
"-//ietf//dtd html//",
|
||||||
|
"-//metrius//dtd metrius presentational//",
|
||||||
|
"-//microsoft//dtd internet explorer 2.0 html strict//",
|
||||||
|
"-//microsoft//dtd internet explorer 2.0 html//",
|
||||||
|
"-//microsoft//dtd internet explorer 2.0 tables//",
|
||||||
|
"-//microsoft//dtd internet explorer 3.0 html strict//",
|
||||||
|
"-//microsoft//dtd internet explorer 3.0 html//",
|
||||||
|
"-//microsoft//dtd internet explorer 3.0 tables//",
|
||||||
|
"-//netscape comm. corp.//dtd html//",
|
||||||
|
"-//netscape comm. corp.//dtd strict html//",
|
||||||
|
"-//o'reilly and associates//dtd html 2.0//",
|
||||||
|
"-//o'reilly and associates//dtd html extended 1.0//",
|
||||||
|
"-//o'reilly and associates//dtd html extended relaxed 1.0//",
|
||||||
|
"-//softquad software//dtd hotmetal pro 6.0::19990601::extensions to html 4.0//",
|
||||||
|
"-//softquad//dtd hotmetal pro 4.0::19971010::extensions to html 4.0//",
|
||||||
|
"-//spyglass//dtd html 2.0 extended//",
|
||||||
|
"-//sq//dtd html 2.0 hotmetal + extensions//",
|
||||||
|
"-//sun microsystems corp.//dtd hotjava html//",
|
||||||
|
"-//sun microsystems corp.//dtd hotjava strict html//",
|
||||||
|
"-//w3c//dtd html 3 1995-03-24//",
|
||||||
|
"-//w3c//dtd html 3.2 draft//",
|
||||||
|
"-//w3c//dtd html 3.2 final//",
|
||||||
|
"-//w3c//dtd html 3.2//",
|
||||||
|
"-//w3c//dtd html 3.2s draft//",
|
||||||
|
"-//w3c//dtd html 4.0 frameset//",
|
||||||
|
"-//w3c//dtd html 4.0 transitional//",
|
||||||
|
"-//w3c//dtd html experimental 19960712//",
|
||||||
|
"-//w3c//dtd html experimental 970421//",
|
||||||
|
"-//w3c//dtd w3 html//",
|
||||||
|
"-//w3o//dtd w3 html 3.0//",
|
||||||
|
"-//webtechs//dtd mozilla html 2.0//",
|
||||||
|
"-//webtechs//dtd mozilla html//",
|
||||||
|
}
|
2253 vendor/golang.org/x/net/html/entity.go generated vendored Normal file
File diff suppressed because it is too large
258 vendor/golang.org/x/net/html/escape.go generated vendored Normal file
@@ -0,0 +1,258 @@
|
||||||
|
// Copyright 2010 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package html
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"strings"
|
||||||
|
"unicode/utf8"
|
||||||
|
)
|
||||||
|
|
||||||
|
// These replacements permit compatibility with old numeric entities that
|
||||||
|
// assumed Windows-1252 encoding.
|
||||||
|
// https://html.spec.whatwg.org/multipage/syntax.html#consume-a-character-reference
|
||||||
|
var replacementTable = [...]rune{
|
||||||
|
'\u20AC', // First entry is what 0x80 should be replaced with.
|
||||||
|
'\u0081',
|
||||||
|
'\u201A',
|
||||||
|
'\u0192',
|
||||||
|
'\u201E',
|
||||||
|
'\u2026',
|
||||||
|
'\u2020',
|
||||||
|
'\u2021',
|
||||||
|
'\u02C6',
|
||||||
|
'\u2030',
|
||||||
|
'\u0160',
|
||||||
|
'\u2039',
|
||||||
|
'\u0152',
|
||||||
|
'\u008D',
|
||||||
|
'\u017D',
|
||||||
|
'\u008F',
|
||||||
|
'\u0090',
|
||||||
|
'\u2018',
|
||||||
|
'\u2019',
|
||||||
|
'\u201C',
|
||||||
|
'\u201D',
|
||||||
|
'\u2022',
|
||||||
|
'\u2013',
|
||||||
|
'\u2014',
|
||||||
|
'\u02DC',
|
||||||
|
'\u2122',
|
||||||
|
'\u0161',
|
||||||
|
'\u203A',
|
||||||
|
'\u0153',
|
||||||
|
'\u009D',
|
||||||
|
'\u017E',
|
||||||
|
'\u0178', // Last entry is 0x9F.
|
||||||
|
// 0x00->'\uFFFD' is handled programmatically.
|
||||||
|
// 0x0D->'\u000D' is a no-op.
|
||||||
|
}
|
||||||
|
|
||||||
|
// unescapeEntity reads an entity like "<" from b[src:] and writes the
|
||||||
|
// corresponding "<" to b[dst:], returning the incremented dst and src cursors.
|
||||||
|
// Precondition: b[src] == '&' && dst <= src.
|
||||||
|
// attribute should be true if parsing an attribute value.
|
||||||
|
func unescapeEntity(b []byte, dst, src int, attribute bool) (dst1, src1 int) {
|
||||||
|
// https://html.spec.whatwg.org/multipage/syntax.html#consume-a-character-reference
|
||||||
|
|
||||||
|
// i starts at 1 because we already know that s[0] == '&'.
|
||||||
|
i, s := 1, b[src:]
|
||||||
|
|
||||||
|
if len(s) <= 1 {
|
||||||
|
b[dst] = b[src]
|
||||||
|
return dst + 1, src + 1
|
||||||
|
}
|
||||||
|
|
||||||
|
if s[i] == '#' {
|
||||||
|
if len(s) <= 3 { // We need to have at least "&#.".
|
||||||
|
b[dst] = b[src]
|
||||||
|
return dst + 1, src + 1
|
||||||
|
}
|
||||||
|
i++
|
||||||
|
c := s[i]
|
||||||
|
hex := false
|
||||||
|
if c == 'x' || c == 'X' {
|
||||||
|
hex = true
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
|
||||||
|
x := '\x00'
|
||||||
|
for i < len(s) {
|
||||||
|
c = s[i]
|
||||||
|
i++
|
||||||
|
if hex {
|
||||||
|
if '0' <= c && c <= '9' {
|
||||||
|
x = 16*x + rune(c) - '0'
|
||||||
|
continue
|
||||||
|
} else if 'a' <= c && c <= 'f' {
|
||||||
|
x = 16*x + rune(c) - 'a' + 10
|
||||||
|
continue
|
||||||
|
} else if 'A' <= c && c <= 'F' {
|
||||||
|
x = 16*x + rune(c) - 'A' + 10
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
} else if '0' <= c && c <= '9' {
|
||||||
|
x = 10*x + rune(c) - '0'
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
if c != ';' {
|
||||||
|
i--
|
||||||
|
}
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if i <= 3 { // No characters matched.
|
||||||
|
b[dst] = b[src]
|
||||||
|
return dst + 1, src + 1
|
||||||
|
}
|
||||||
|
|
||||||
|
if 0x80 <= x && x <= 0x9F {
|
||||||
|
// Replace characters from Windows-1252 with UTF-8 equivalents.
|
||||||
|
x = replacementTable[x-0x80]
|
||||||
|
} else if x == 0 || (0xD800 <= x && x <= 0xDFFF) || x > 0x10FFFF {
|
||||||
|
// Replace invalid characters with the replacement character.
|
||||||
|
x = '\uFFFD'
|
||||||
|
}
|
||||||
|
|
||||||
|
return dst + utf8.EncodeRune(b[dst:], x), src + i
|
||||||
|
}
|
||||||
|
|
||||||
|
// Consume the maximum number of characters possible, with the
|
||||||
|
// consumed characters matching one of the named references.
|
||||||
|
|
||||||
|
for i < len(s) {
|
||||||
|
c := s[i]
|
||||||
|
i++
|
||||||
|
// Lower-cased characters are more common in entities, so we check for them first.
|
||||||
|
if 'a' <= c && c <= 'z' || 'A' <= c && c <= 'Z' || '0' <= c && c <= '9' {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
if c != ';' {
|
||||||
|
i--
|
||||||
|
}
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
entityName := string(s[1:i])
|
||||||
|
if entityName == "" {
|
||||||
|
// No-op.
|
||||||
|
} else if attribute && entityName[len(entityName)-1] != ';' && len(s) > i && s[i] == '=' {
|
||||||
|
// No-op.
|
||||||
|
} else if x := entity[entityName]; x != 0 {
|
||||||
|
return dst + utf8.EncodeRune(b[dst:], x), src + i
|
||||||
|
} else if x := entity2[entityName]; x[0] != 0 {
|
||||||
|
dst1 := dst + utf8.EncodeRune(b[dst:], x[0])
|
||||||
|
return dst1 + utf8.EncodeRune(b[dst1:], x[1]), src + i
|
||||||
|
} else if !attribute {
|
||||||
|
maxLen := len(entityName) - 1
|
||||||
|
if maxLen > longestEntityWithoutSemicolon {
|
||||||
|
maxLen = longestEntityWithoutSemicolon
|
||||||
|
}
|
||||||
|
for j := maxLen; j > 1; j-- {
|
||||||
|
if x := entity[entityName[:j]]; x != 0 {
|
||||||
|
return dst + utf8.EncodeRune(b[dst:], x), src + j + 1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
dst1, src1 = dst+i, src+i
|
||||||
|
copy(b[dst:dst1], b[src:src1])
|
||||||
|
return dst1, src1
|
||||||
|
}
|
||||||
|
|
||||||
|
// unescape unescapes b's entities in-place, so that "a&lt;b" becomes "a<b".
|
||||||
|
// attribute should be true if parsing an attribute value.
|
||||||
|
func unescape(b []byte, attribute bool) []byte {
|
||||||
|
for i, c := range b {
|
||||||
|
if c == '&' {
|
||||||
|
dst, src := unescapeEntity(b, i, i, attribute)
|
||||||
|
for src < len(b) {
|
||||||
|
c := b[src]
|
||||||
|
if c == '&' {
|
||||||
|
dst, src = unescapeEntity(b, dst, src, attribute)
|
||||||
|
} else {
|
||||||
|
b[dst] = c
|
||||||
|
dst, src = dst+1, src+1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return b[0:dst]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return b
|
||||||
|
}
|
||||||
|
|
||||||
|
// lower lower-cases the A-Z bytes in b in-place, so that "aBc" becomes "abc".
|
||||||
|
func lower(b []byte) []byte {
|
||||||
|
for i, c := range b {
|
||||||
|
if 'A' <= c && c <= 'Z' {
|
||||||
|
b[i] = c + 'a' - 'A'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return b
|
||||||
|
}
|
||||||
|
|
||||||
|
const escapedChars = "&'<>\"\r"
|
||||||
|
|
||||||
|
func escape(w writer, s string) error {
|
||||||
|
i := strings.IndexAny(s, escapedChars)
|
||||||
|
for i != -1 {
|
||||||
|
if _, err := w.WriteString(s[:i]); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
var esc string
|
||||||
|
switch s[i] {
|
||||||
|
case '&':
|
||||||
|
esc = "&"
|
||||||
|
case '\'':
|
||||||
|
// "'" is shorter than "'" and apos was not in HTML until HTML5.
|
||||||
|
esc = "'"
|
||||||
|
case '<':
|
||||||
|
esc = "<"
|
||||||
|
case '>':
|
||||||
|
esc = ">"
|
||||||
|
case '"':
|
||||||
|
// """ is shorter than """.
|
||||||
|
esc = """
|
||||||
|
case '\r':
|
||||||
|
esc = " "
|
||||||
|
default:
|
||||||
|
panic("unrecognized escape character")
|
||||||
|
}
|
||||||
|
s = s[i+1:]
|
||||||
|
if _, err := w.WriteString(esc); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
i = strings.IndexAny(s, escapedChars)
|
||||||
|
}
|
||||||
|
_, err := w.WriteString(s)
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// EscapeString escapes special characters like "<" to become "&lt;". It
|
||||||
|
// escapes only five such characters: <, >, &, ' and ".
|
||||||
|
// UnescapeString(EscapeString(s)) == s always holds, but the converse isn't
|
||||||
|
// always true.
|
||||||
|
func EscapeString(s string) string {
|
||||||
|
if strings.IndexAny(s, escapedChars) == -1 {
|
||||||
|
return s
|
||||||
|
}
|
||||||
|
var buf bytes.Buffer
|
||||||
|
escape(&buf, s)
|
||||||
|
return buf.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
// UnescapeString unescapes entities like "&lt;" to become "<". It unescapes a
|
||||||
|
// larger range of entities than EscapeString escapes. For example, "&aacute;"
|
||||||
|
// unescapes to "á", as does "á" and "&xE1;".
|
||||||
|
// UnescapeString(EscapeString(s)) == s always holds, but the converse isn't
|
||||||
|
// always true.
|
||||||
|
func UnescapeString(s string) string {
|
||||||
|
for _, c := range s {
|
||||||
|
if c == '&' {
|
||||||
|
return string(unescape([]byte(s), false))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
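// Illustrative sketch (not part of the upstream file): the round-trip property
// documented above, exercised through this package's exported API. The entity
// forms in the output are the ones chosen by escape() above.
//
//	package main
//
//	import (
//		"fmt"
//
//		"golang.org/x/net/html"
//	)
//
//	func main() {
//		s := `a<b & "c"`
//		e := html.EscapeString(s)                // "a&lt;b &amp; &#34;c&#34;"
//		fmt.Println(html.UnescapeString(e) == s) // true
//		// The converse does not always hold: UnescapeString("&nbsp;") yields a
//		// non-breaking space, and escaping that result does not restore "&nbsp;".
//	}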
|
226
vendor/golang.org/x/net/html/foreign.go
generated
vendored
Normal file
|
@ -0,0 +1,226 @@
|
||||||
|
// Copyright 2011 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package html
|
||||||
|
|
||||||
|
import (
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
func adjustAttributeNames(aa []Attribute, nameMap map[string]string) {
|
||||||
|
for i := range aa {
|
||||||
|
if newName, ok := nameMap[aa[i].Key]; ok {
|
||||||
|
aa[i].Key = newName
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func adjustForeignAttributes(aa []Attribute) {
|
||||||
|
for i, a := range aa {
|
||||||
|
if a.Key == "" || a.Key[0] != 'x' {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
switch a.Key {
|
||||||
|
case "xlink:actuate", "xlink:arcrole", "xlink:href", "xlink:role", "xlink:show",
|
||||||
|
"xlink:title", "xlink:type", "xml:base", "xml:lang", "xml:space", "xmlns:xlink":
|
||||||
|
j := strings.Index(a.Key, ":")
|
||||||
|
aa[i].Namespace = a.Key[:j]
|
||||||
|
aa[i].Key = a.Key[j+1:]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func htmlIntegrationPoint(n *Node) bool {
|
||||||
|
if n.Type != ElementNode {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
switch n.Namespace {
|
||||||
|
case "math":
|
||||||
|
if n.Data == "annotation-xml" {
|
||||||
|
for _, a := range n.Attr {
|
||||||
|
if a.Key == "encoding" {
|
||||||
|
val := strings.ToLower(a.Val)
|
||||||
|
if val == "text/html" || val == "application/xhtml+xml" {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
case "svg":
|
||||||
|
switch n.Data {
|
||||||
|
case "desc", "foreignObject", "title":
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
func mathMLTextIntegrationPoint(n *Node) bool {
|
||||||
|
if n.Namespace != "math" {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
switch n.Data {
|
||||||
|
case "mi", "mo", "mn", "ms", "mtext":
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Section 12.2.6.5.
|
||||||
|
var breakout = map[string]bool{
|
||||||
|
"b": true,
|
||||||
|
"big": true,
|
||||||
|
"blockquote": true,
|
||||||
|
"body": true,
|
||||||
|
"br": true,
|
||||||
|
"center": true,
|
||||||
|
"code": true,
|
||||||
|
"dd": true,
|
||||||
|
"div": true,
|
||||||
|
"dl": true,
|
||||||
|
"dt": true,
|
||||||
|
"em": true,
|
||||||
|
"embed": true,
|
||||||
|
"h1": true,
|
||||||
|
"h2": true,
|
||||||
|
"h3": true,
|
||||||
|
"h4": true,
|
||||||
|
"h5": true,
|
||||||
|
"h6": true,
|
||||||
|
"head": true,
|
||||||
|
"hr": true,
|
||||||
|
"i": true,
|
||||||
|
"img": true,
|
||||||
|
"li": true,
|
||||||
|
"listing": true,
|
||||||
|
"menu": true,
|
||||||
|
"meta": true,
|
||||||
|
"nobr": true,
|
||||||
|
"ol": true,
|
||||||
|
"p": true,
|
||||||
|
"pre": true,
|
||||||
|
"ruby": true,
|
||||||
|
"s": true,
|
||||||
|
"small": true,
|
||||||
|
"span": true,
|
||||||
|
"strong": true,
|
||||||
|
"strike": true,
|
||||||
|
"sub": true,
|
||||||
|
"sup": true,
|
||||||
|
"table": true,
|
||||||
|
"tt": true,
|
||||||
|
"u": true,
|
||||||
|
"ul": true,
|
||||||
|
"var": true,
|
||||||
|
}
|
||||||
|
|
||||||
|
// Section 12.2.6.5.
|
||||||
|
var svgTagNameAdjustments = map[string]string{
|
||||||
|
"altglyph": "altGlyph",
|
||||||
|
"altglyphdef": "altGlyphDef",
|
||||||
|
"altglyphitem": "altGlyphItem",
|
||||||
|
"animatecolor": "animateColor",
|
||||||
|
"animatemotion": "animateMotion",
|
||||||
|
"animatetransform": "animateTransform",
|
||||||
|
"clippath": "clipPath",
|
||||||
|
"feblend": "feBlend",
|
||||||
|
"fecolormatrix": "feColorMatrix",
|
||||||
|
"fecomponenttransfer": "feComponentTransfer",
|
||||||
|
"fecomposite": "feComposite",
|
||||||
|
"feconvolvematrix": "feConvolveMatrix",
|
||||||
|
"fediffuselighting": "feDiffuseLighting",
|
||||||
|
"fedisplacementmap": "feDisplacementMap",
|
||||||
|
"fedistantlight": "feDistantLight",
|
||||||
|
"feflood": "feFlood",
|
||||||
|
"fefunca": "feFuncA",
|
||||||
|
"fefuncb": "feFuncB",
|
||||||
|
"fefuncg": "feFuncG",
|
||||||
|
"fefuncr": "feFuncR",
|
||||||
|
"fegaussianblur": "feGaussianBlur",
|
||||||
|
"feimage": "feImage",
|
||||||
|
"femerge": "feMerge",
|
||||||
|
"femergenode": "feMergeNode",
|
||||||
|
"femorphology": "feMorphology",
|
||||||
|
"feoffset": "feOffset",
|
||||||
|
"fepointlight": "fePointLight",
|
||||||
|
"fespecularlighting": "feSpecularLighting",
|
||||||
|
"fespotlight": "feSpotLight",
|
||||||
|
"fetile": "feTile",
|
||||||
|
"feturbulence": "feTurbulence",
|
||||||
|
"foreignobject": "foreignObject",
|
||||||
|
"glyphref": "glyphRef",
|
||||||
|
"lineargradient": "linearGradient",
|
||||||
|
"radialgradient": "radialGradient",
|
||||||
|
"textpath": "textPath",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Section 12.2.6.1
|
||||||
|
var mathMLAttributeAdjustments = map[string]string{
|
||||||
|
"definitionurl": "definitionURL",
|
||||||
|
}
|
||||||
|
|
||||||
|
var svgAttributeAdjustments = map[string]string{
|
||||||
|
"attributename": "attributeName",
|
||||||
|
"attributetype": "attributeType",
|
||||||
|
"basefrequency": "baseFrequency",
|
||||||
|
"baseprofile": "baseProfile",
|
||||||
|
"calcmode": "calcMode",
|
||||||
|
"clippathunits": "clipPathUnits",
|
||||||
|
"contentscripttype": "contentScriptType",
|
||||||
|
"contentstyletype": "contentStyleType",
|
||||||
|
"diffuseconstant": "diffuseConstant",
|
||||||
|
"edgemode": "edgeMode",
|
||||||
|
"externalresourcesrequired": "externalResourcesRequired",
|
||||||
|
"filterres": "filterRes",
|
||||||
|
"filterunits": "filterUnits",
|
||||||
|
"glyphref": "glyphRef",
|
||||||
|
"gradienttransform": "gradientTransform",
|
||||||
|
"gradientunits": "gradientUnits",
|
||||||
|
"kernelmatrix": "kernelMatrix",
|
||||||
|
"kernelunitlength": "kernelUnitLength",
|
||||||
|
"keypoints": "keyPoints",
|
||||||
|
"keysplines": "keySplines",
|
||||||
|
"keytimes": "keyTimes",
|
||||||
|
"lengthadjust": "lengthAdjust",
|
||||||
|
"limitingconeangle": "limitingConeAngle",
|
||||||
|
"markerheight": "markerHeight",
|
||||||
|
"markerunits": "markerUnits",
|
||||||
|
"markerwidth": "markerWidth",
|
||||||
|
"maskcontentunits": "maskContentUnits",
|
||||||
|
"maskunits": "maskUnits",
|
||||||
|
"numoctaves": "numOctaves",
|
||||||
|
"pathlength": "pathLength",
|
||||||
|
"patterncontentunits": "patternContentUnits",
|
||||||
|
"patterntransform": "patternTransform",
|
||||||
|
"patternunits": "patternUnits",
|
||||||
|
"pointsatx": "pointsAtX",
|
||||||
|
"pointsaty": "pointsAtY",
|
||||||
|
"pointsatz": "pointsAtZ",
|
||||||
|
"preservealpha": "preserveAlpha",
|
||||||
|
"preserveaspectratio": "preserveAspectRatio",
|
||||||
|
"primitiveunits": "primitiveUnits",
|
||||||
|
"refx": "refX",
|
||||||
|
"refy": "refY",
|
||||||
|
"repeatcount": "repeatCount",
|
||||||
|
"repeatdur": "repeatDur",
|
||||||
|
"requiredextensions": "requiredExtensions",
|
||||||
|
"requiredfeatures": "requiredFeatures",
|
||||||
|
"specularconstant": "specularConstant",
|
||||||
|
"specularexponent": "specularExponent",
|
||||||
|
"spreadmethod": "spreadMethod",
|
||||||
|
"startoffset": "startOffset",
|
||||||
|
"stddeviation": "stdDeviation",
|
||||||
|
"stitchtiles": "stitchTiles",
|
||||||
|
"surfacescale": "surfaceScale",
|
||||||
|
"systemlanguage": "systemLanguage",
|
||||||
|
"tablevalues": "tableValues",
|
||||||
|
"targetx": "targetX",
|
||||||
|
"targety": "targetY",
|
||||||
|
"textlength": "textLength",
|
||||||
|
"viewbox": "viewBox",
|
||||||
|
"viewtarget": "viewTarget",
|
||||||
|
"xchannelselector": "xChannelSelector",
|
||||||
|
"ychannelselector": "yChannelSelector",
|
||||||
|
"zoomandpan": "zoomAndPan",
|
||||||
|
}
|
220
vendor/golang.org/x/net/html/node.go
generated
vendored
Normal file
|
@ -0,0 +1,220 @@
|
||||||
|
// Copyright 2011 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package html
|
||||||
|
|
||||||
|
import (
|
||||||
|
"golang.org/x/net/html/atom"
|
||||||
|
)
|
||||||
|
|
||||||
|
// A NodeType is the type of a Node.
|
||||||
|
type NodeType uint32
|
||||||
|
|
||||||
|
const (
|
||||||
|
ErrorNode NodeType = iota
|
||||||
|
TextNode
|
||||||
|
DocumentNode
|
||||||
|
ElementNode
|
||||||
|
CommentNode
|
||||||
|
DoctypeNode
|
||||||
|
scopeMarkerNode
|
||||||
|
)
|
||||||
|
|
||||||
|
// Section 12.2.4.3 says "The markers are inserted when entering applet,
|
||||||
|
// object, marquee, template, td, th, and caption elements, and are used
|
||||||
|
// to prevent formatting from "leaking" into applet, object, marquee,
|
||||||
|
// template, td, th, and caption elements".
|
||||||
|
var scopeMarker = Node{Type: scopeMarkerNode}
|
||||||
|
|
||||||
|
// A Node consists of a NodeType and some Data (tag name for element nodes,
|
||||||
|
// content for text) and are part of a tree of Nodes. Element nodes may also
|
||||||
|
// have a Namespace and contain a slice of Attributes. Data is unescaped, so
|
||||||
|
// that it looks like "a<b" rather than "a&lt;b". For element nodes, DataAtom
|
||||||
|
// is the atom for Data, or zero if Data is not a known tag name.
|
||||||
|
//
|
||||||
|
// An empty Namespace implies a "http://www.w3.org/1999/xhtml" namespace.
|
||||||
|
// Similarly, "math" is short for "http://www.w3.org/1998/Math/MathML", and
|
||||||
|
// "svg" is short for "http://www.w3.org/2000/svg".
|
||||||
|
type Node struct {
|
||||||
|
Parent, FirstChild, LastChild, PrevSibling, NextSibling *Node
|
||||||
|
|
||||||
|
Type NodeType
|
||||||
|
DataAtom atom.Atom
|
||||||
|
Data string
|
||||||
|
Namespace string
|
||||||
|
Attr []Attribute
|
||||||
|
}
|
||||||
|
|
||||||
|
// InsertBefore inserts newChild as a child of n, immediately before oldChild
|
||||||
|
// in the sequence of n's children. oldChild may be nil, in which case newChild
|
||||||
|
// is appended to the end of n's children.
|
||||||
|
//
|
||||||
|
// It will panic if newChild already has a parent or siblings.
|
||||||
|
func (n *Node) InsertBefore(newChild, oldChild *Node) {
|
||||||
|
if newChild.Parent != nil || newChild.PrevSibling != nil || newChild.NextSibling != nil {
|
||||||
|
panic("html: InsertBefore called for an attached child Node")
|
||||||
|
}
|
||||||
|
var prev, next *Node
|
||||||
|
if oldChild != nil {
|
||||||
|
prev, next = oldChild.PrevSibling, oldChild
|
||||||
|
} else {
|
||||||
|
prev = n.LastChild
|
||||||
|
}
|
||||||
|
if prev != nil {
|
||||||
|
prev.NextSibling = newChild
|
||||||
|
} else {
|
||||||
|
n.FirstChild = newChild
|
||||||
|
}
|
||||||
|
if next != nil {
|
||||||
|
next.PrevSibling = newChild
|
||||||
|
} else {
|
||||||
|
n.LastChild = newChild
|
||||||
|
}
|
||||||
|
newChild.Parent = n
|
||||||
|
newChild.PrevSibling = prev
|
||||||
|
newChild.NextSibling = next
|
||||||
|
}
|
||||||
|
|
||||||
|
// AppendChild adds a node c as a child of n.
|
||||||
|
//
|
||||||
|
// It will panic if c already has a parent or siblings.
|
||||||
|
func (n *Node) AppendChild(c *Node) {
|
||||||
|
if c.Parent != nil || c.PrevSibling != nil || c.NextSibling != nil {
|
||||||
|
panic("html: AppendChild called for an attached child Node")
|
||||||
|
}
|
||||||
|
last := n.LastChild
|
||||||
|
if last != nil {
|
||||||
|
last.NextSibling = c
|
||||||
|
} else {
|
||||||
|
n.FirstChild = c
|
||||||
|
}
|
||||||
|
n.LastChild = c
|
||||||
|
c.Parent = n
|
||||||
|
c.PrevSibling = last
|
||||||
|
}
|
||||||
|
|
||||||
|
// RemoveChild removes a node c that is a child of n. Afterwards, c will have
|
||||||
|
// no parent and no siblings.
|
||||||
|
//
|
||||||
|
// It will panic if c's parent is not n.
|
||||||
|
func (n *Node) RemoveChild(c *Node) {
|
||||||
|
if c.Parent != n {
|
||||||
|
panic("html: RemoveChild called for a non-child Node")
|
||||||
|
}
|
||||||
|
if n.FirstChild == c {
|
||||||
|
n.FirstChild = c.NextSibling
|
||||||
|
}
|
||||||
|
if c.NextSibling != nil {
|
||||||
|
c.NextSibling.PrevSibling = c.PrevSibling
|
||||||
|
}
|
||||||
|
if n.LastChild == c {
|
||||||
|
n.LastChild = c.PrevSibling
|
||||||
|
}
|
||||||
|
if c.PrevSibling != nil {
|
||||||
|
c.PrevSibling.NextSibling = c.NextSibling
|
||||||
|
}
|
||||||
|
c.Parent = nil
|
||||||
|
c.PrevSibling = nil
|
||||||
|
c.NextSibling = nil
|
||||||
|
}
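// Illustrative sketch (not part of the upstream file): building and editing a
// small tree with the methods above, as seen from an importing package. Nodes
// must be detached (no parent, no siblings) before AppendChild or InsertBefore,
// which is why RemoveChild is called before re-attaching b.
//
//	ul := &html.Node{Type: html.ElementNode, DataAtom: atom.Ul, Data: "ul"}
//	a := &html.Node{Type: html.ElementNode, DataAtom: atom.Li, Data: "li"}
//	b := &html.Node{Type: html.ElementNode, DataAtom: atom.Li, Data: "li"}
//	ul.AppendChild(a)     // <ul><li(a)></ul>
//	ul.InsertBefore(b, a) // <ul><li(b)><li(a)></ul>
//	ul.RemoveChild(b)     // b is detached again
//	ul.AppendChild(b)     // <ul><li(a)><li(b)></ul>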
|
||||||
|
|
||||||
|
// reparentChildren reparents all of src's child nodes to dst.
|
||||||
|
func reparentChildren(dst, src *Node) {
|
||||||
|
for {
|
||||||
|
child := src.FirstChild
|
||||||
|
if child == nil {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
src.RemoveChild(child)
|
||||||
|
dst.AppendChild(child)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// clone returns a new node with the same type, data and attributes.
|
||||||
|
// The clone has no parent, no siblings and no children.
|
||||||
|
func (n *Node) clone() *Node {
|
||||||
|
m := &Node{
|
||||||
|
Type: n.Type,
|
||||||
|
DataAtom: n.DataAtom,
|
||||||
|
Data: n.Data,
|
||||||
|
Attr: make([]Attribute, len(n.Attr)),
|
||||||
|
}
|
||||||
|
copy(m.Attr, n.Attr)
|
||||||
|
return m
|
||||||
|
}
|
||||||
|
|
||||||
|
// nodeStack is a stack of nodes.
|
||||||
|
type nodeStack []*Node
|
||||||
|
|
||||||
|
// pop pops the stack. It will panic if s is empty.
|
||||||
|
func (s *nodeStack) pop() *Node {
|
||||||
|
i := len(*s)
|
||||||
|
n := (*s)[i-1]
|
||||||
|
*s = (*s)[:i-1]
|
||||||
|
return n
|
||||||
|
}
|
||||||
|
|
||||||
|
// top returns the most recently pushed node, or nil if s is empty.
|
||||||
|
func (s *nodeStack) top() *Node {
|
||||||
|
if i := len(*s); i > 0 {
|
||||||
|
return (*s)[i-1]
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// index returns the index of the top-most occurrence of n in the stack, or -1
|
||||||
|
// if n is not present.
|
||||||
|
func (s *nodeStack) index(n *Node) int {
|
||||||
|
for i := len(*s) - 1; i >= 0; i-- {
|
||||||
|
if (*s)[i] == n {
|
||||||
|
return i
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return -1
|
||||||
|
}
|
||||||
|
|
||||||
|
// contains returns whether a is within s.
|
||||||
|
func (s *nodeStack) contains(a atom.Atom) bool {
|
||||||
|
for _, n := range *s {
|
||||||
|
if n.DataAtom == a && n.Namespace == "" {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// insert inserts a node at the given index.
|
||||||
|
func (s *nodeStack) insert(i int, n *Node) {
|
||||||
|
(*s) = append(*s, nil)
|
||||||
|
copy((*s)[i+1:], (*s)[i:])
|
||||||
|
(*s)[i] = n
|
||||||
|
}
|
||||||
|
|
||||||
|
// remove removes a node from the stack. It is a no-op if n is not present.
|
||||||
|
func (s *nodeStack) remove(n *Node) {
|
||||||
|
i := s.index(n)
|
||||||
|
if i == -1 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
copy((*s)[i:], (*s)[i+1:])
|
||||||
|
j := len(*s) - 1
|
||||||
|
(*s)[j] = nil
|
||||||
|
*s = (*s)[:j]
|
||||||
|
}
|
||||||
|
|
||||||
|
type insertionModeStack []insertionMode
|
||||||
|
|
||||||
|
func (s *insertionModeStack) pop() (im insertionMode) {
|
||||||
|
i := len(*s)
|
||||||
|
im = (*s)[i-1]
|
||||||
|
*s = (*s)[:i-1]
|
||||||
|
return im
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *insertionModeStack) top() insertionMode {
|
||||||
|
if i := len(*s); i > 0 {
|
||||||
|
return (*s)[i-1]
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
271
vendor/golang.org/x/net/html/render.go
generated
vendored
Normal file
|
@ -0,0 +1,271 @@
|
||||||
|
// Copyright 2011 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package html
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
type writer interface {
|
||||||
|
io.Writer
|
||||||
|
io.ByteWriter
|
||||||
|
WriteString(string) (int, error)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Render renders the parse tree n to the given writer.
|
||||||
|
//
|
||||||
|
// Rendering is done on a 'best effort' basis: calling Parse on the output of
|
||||||
|
// Render will always result in something similar to the original tree, but it
|
||||||
|
// is not necessarily an exact clone unless the original tree was 'well-formed'.
|
||||||
|
// 'Well-formed' is not easily specified; the HTML5 specification is
|
||||||
|
// complicated.
|
||||||
|
//
|
||||||
|
// Calling Parse on arbitrary input typically results in a 'well-formed' parse
|
||||||
|
// tree. However, it is possible for Parse to yield a 'badly-formed' parse tree.
|
||||||
|
// For example, in a 'well-formed' parse tree, no <a> element is a child of
|
||||||
|
// another <a> element: parsing "<a><a>" results in two sibling elements.
|
||||||
|
// Similarly, in a 'well-formed' parse tree, no <a> element is a child of a
|
||||||
|
// <table> element: parsing "<p><table><a>" results in a <p> with two sibling
|
||||||
|
// children; the <a> is reparented to the <table>'s parent. However, calling
|
||||||
|
// Parse on "<a><table><a>" does not return an error, but the result has an <a>
|
||||||
|
// element with an <a> child, and is therefore not 'well-formed'.
|
||||||
|
//
|
||||||
|
// Programmatically constructed trees are typically also 'well-formed', but it
|
||||||
|
// is possible to construct a tree that looks innocuous but, when rendered and
|
||||||
|
// re-parsed, results in a different tree. A simple example is that a solitary
|
||||||
|
// text node would become a tree containing <html>, <head> and <body> elements.
|
||||||
|
// Another example is that the programmatic equivalent of "a<head>b</head>c"
|
||||||
|
// becomes "<html><head><head/><body>abc</body></html>".
|
||||||
|
func Render(w io.Writer, n *Node) error {
|
||||||
|
if x, ok := w.(writer); ok {
|
||||||
|
return render(x, n)
|
||||||
|
}
|
||||||
|
buf := bufio.NewWriter(w)
|
||||||
|
if err := render(buf, n); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
return buf.Flush()
|
||||||
|
}
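// Illustrative sketch (not part of the upstream file): the 'best effort' round
// trip described above, assuming the usual imports (bytes, log, strings) and
// this package imported as html.
//
//	doc, err := html.Parse(strings.NewReader("<p>a&lt;b</p>"))
//	if err != nil {
//		log.Fatal(err)
//	}
//	var buf bytes.Buffer
//	if err := html.Render(&buf, doc); err != nil {
//		log.Fatal(err)
//	}
//	// buf.String() is a full document:
//	// <html><head></head><body><p>a&lt;b</p></body></html>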
|
||||||
|
|
||||||
|
// plaintextAbort is returned from render1 when a <plaintext> element
|
||||||
|
// has been rendered. No more end tags should be rendered after that.
|
||||||
|
var plaintextAbort = errors.New("html: internal error (plaintext abort)")
|
||||||
|
|
||||||
|
func render(w writer, n *Node) error {
|
||||||
|
err := render1(w, n)
|
||||||
|
if err == plaintextAbort {
|
||||||
|
err = nil
|
||||||
|
}
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
func render1(w writer, n *Node) error {
|
||||||
|
// Render non-element nodes; these are the easy cases.
|
||||||
|
switch n.Type {
|
||||||
|
case ErrorNode:
|
||||||
|
return errors.New("html: cannot render an ErrorNode node")
|
||||||
|
case TextNode:
|
||||||
|
return escape(w, n.Data)
|
||||||
|
case DocumentNode:
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if err := render1(w, c); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
case ElementNode:
|
||||||
|
// No-op.
|
||||||
|
case CommentNode:
|
||||||
|
if _, err := w.WriteString("<!--"); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(n.Data); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString("-->"); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
case DoctypeNode:
|
||||||
|
if _, err := w.WriteString("<!DOCTYPE "); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(n.Data); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if n.Attr != nil {
|
||||||
|
var p, s string
|
||||||
|
for _, a := range n.Attr {
|
||||||
|
switch a.Key {
|
||||||
|
case "public":
|
||||||
|
p = a.Val
|
||||||
|
case "system":
|
||||||
|
s = a.Val
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if p != "" {
|
||||||
|
if _, err := w.WriteString(" PUBLIC "); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := writeQuoted(w, p); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if s != "" {
|
||||||
|
if err := w.WriteByte(' '); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := writeQuoted(w, s); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if s != "" {
|
||||||
|
if _, err := w.WriteString(" SYSTEM "); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := writeQuoted(w, s); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return w.WriteByte('>')
|
||||||
|
default:
|
||||||
|
return errors.New("html: unknown node type")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Render the <xxx> opening tag.
|
||||||
|
if err := w.WriteByte('<'); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(n.Data); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
for _, a := range n.Attr {
|
||||||
|
if err := w.WriteByte(' '); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if a.Namespace != "" {
|
||||||
|
if _, err := w.WriteString(a.Namespace); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := w.WriteByte(':'); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(a.Key); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(`="`); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := escape(w, a.Val); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := w.WriteByte('"'); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if voidElements[n.Data] {
|
||||||
|
if n.FirstChild != nil {
|
||||||
|
return fmt.Errorf("html: void element <%s> has child nodes", n.Data)
|
||||||
|
}
|
||||||
|
_, err := w.WriteString("/>")
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := w.WriteByte('>'); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add initial newline where there is danger of a newline being ignored.
|
||||||
|
if c := n.FirstChild; c != nil && c.Type == TextNode && strings.HasPrefix(c.Data, "\n") {
|
||||||
|
switch n.Data {
|
||||||
|
case "pre", "listing", "textarea":
|
||||||
|
if err := w.WriteByte('\n'); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Render any child nodes.
|
||||||
|
switch n.Data {
|
||||||
|
case "iframe", "noembed", "noframes", "noscript", "plaintext", "script", "style", "xmp":
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if c.Type == TextNode {
|
||||||
|
if _, err := w.WriteString(c.Data); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if err := render1(w, c); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if n.Data == "plaintext" {
|
||||||
|
// Don't render anything else. <plaintext> must be the
|
||||||
|
// last element in the file, with no closing tag.
|
||||||
|
return plaintextAbort
|
||||||
|
}
|
||||||
|
default:
|
||||||
|
for c := n.FirstChild; c != nil; c = c.NextSibling {
|
||||||
|
if err := render1(w, c); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Render the </xxx> closing tag.
|
||||||
|
if _, err := w.WriteString("</"); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(n.Data); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
return w.WriteByte('>')
|
||||||
|
}
|
||||||
|
|
||||||
|
// writeQuoted writes s to w surrounded by quotes. Normally it will use double
|
||||||
|
// quotes, but if s contains a double quote, it will use single quotes.
|
||||||
|
// It is used for writing the identifiers in a doctype declaration.
|
||||||
|
// In valid HTML, they can't contain both types of quotes.
|
||||||
|
func writeQuoted(w writer, s string) error {
|
||||||
|
var q byte = '"'
|
||||||
|
if strings.Contains(s, `"`) {
|
||||||
|
q = '\''
|
||||||
|
}
|
||||||
|
if err := w.WriteByte(q); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if _, err := w.WriteString(s); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if err := w.WriteByte(q); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Section 12.1.2, "Elements", gives this list of void elements. Void elements
|
||||||
|
// are those that can't have any contents.
|
||||||
|
var voidElements = map[string]bool{
|
||||||
|
"area": true,
|
||||||
|
"base": true,
|
||||||
|
"br": true,
|
||||||
|
"col": true,
|
||||||
|
"command": true,
|
||||||
|
"embed": true,
|
||||||
|
"hr": true,
|
||||||
|
"img": true,
|
||||||
|
"input": true,
|
||||||
|
"keygen": true,
|
||||||
|
"link": true,
|
||||||
|
"meta": true,
|
||||||
|
"param": true,
|
||||||
|
"source": true,
|
||||||
|
"track": true,
|
||||||
|
"wbr": true,
|
||||||
|
}
|
1219
vendor/golang.org/x/net/html/token.go
generated
vendored
Normal file
File diff suppressed because it is too large
27
vendor/golang.org/x/text/LICENSE
generated
vendored
Normal file
|
@ -0,0 +1,27 @@
|
||||||
|
Copyright (c) 2009 The Go Authors. All rights reserved.
|
||||||
|
|
||||||
|
Redistribution and use in source and binary forms, with or without
|
||||||
|
modification, are permitted provided that the following conditions are
|
||||||
|
met:
|
||||||
|
|
||||||
|
* Redistributions of source code must retain the above copyright
|
||||||
|
notice, this list of conditions and the following disclaimer.
|
||||||
|
* Redistributions in binary form must reproduce the above
|
||||||
|
copyright notice, this list of conditions and the following disclaimer
|
||||||
|
in the documentation and/or other materials provided with the
|
||||||
|
distribution.
|
||||||
|
* Neither the name of Google Inc. nor the names of its
|
||||||
|
contributors may be used to endorse or promote products derived from
|
||||||
|
this software without specific prior written permission.
|
||||||
|
|
||||||
|
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
|
||||||
|
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
|
||||||
|
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
|
||||||
|
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||||
|
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
|
||||||
|
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
|
||||||
|
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
|
||||||
|
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
|
||||||
|
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
|
||||||
|
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
22
vendor/golang.org/x/text/PATENTS
generated
vendored
Normal file
|
@ -0,0 +1,22 @@
|
||||||
|
Additional IP Rights Grant (Patents)
|
||||||
|
|
||||||
|
"This implementation" means the copyrightable works distributed by
|
||||||
|
Google as part of the Go project.
|
||||||
|
|
||||||
|
Google hereby grants to You a perpetual, worldwide, non-exclusive,
|
||||||
|
no-charge, royalty-free, irrevocable (except as stated in this section)
|
||||||
|
patent license to make, have made, use, offer to sell, sell, import,
|
||||||
|
transfer and otherwise run, modify and propagate the contents of this
|
||||||
|
implementation of Go, where such license applies only to those patent
|
||||||
|
claims, both currently owned or controlled by Google and acquired in
|
||||||
|
the future, licensable by Google that are necessarily infringed by this
|
||||||
|
implementation of Go. This grant does not include claims that would be
|
||||||
|
infringed only as a consequence of further modification of this
|
||||||
|
implementation. If you or your agent or exclusive licensee institute or
|
||||||
|
order or agree to the institution of patent litigation against any
|
||||||
|
entity (including a cross-claim or counterclaim in a lawsuit) alleging
|
||||||
|
that this implementation of Go or any code incorporated within this
|
||||||
|
implementation of Go constitutes direct or contributory patent
|
||||||
|
infringement, or inducement of patent infringement, then any patent
|
||||||
|
rights granted to you under this License for this implementation of Go
|
||||||
|
shall terminate as of the date such litigation is filed.
|
81
vendor/golang.org/x/text/encoding/internal/identifier/identifier.go
generated
vendored
Normal file
|
@ -0,0 +1,81 @@
|
||||||
|
// Copyright 2015 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
//go:generate go run gen.go
|
||||||
|
|
||||||
|
// Package identifier defines the contract between implementations of Encoding
|
||||||
|
// and Index by defining identifiers that uniquely identify standardized coded
|
||||||
|
// character sets (CCS) and character encoding schemes (CES), which we will
|
||||||
|
// together refer to as encodings, for which Encoding implementations provide
|
||||||
|
// converters to and from UTF-8. This package is typically only of concern to
|
||||||
|
// implementers of Indexes and Encodings.
|
||||||
|
//
|
||||||
|
// One part of the identifier is the MIB code, which is defined by IANA and
|
||||||
|
// uniquely identifies a CCS or CES. Each code is associated with data that
|
||||||
|
// references authorities, official documentation as well as aliases and MIME
|
||||||
|
// names.
|
||||||
|
//
|
||||||
|
// Not all CESs are covered by the IANA registry. The "other" string that is
|
||||||
|
// returned by ID can be used to identify other character sets or versions of
|
||||||
|
// existing ones.
|
||||||
|
//
|
||||||
|
// It is recommended that each package that provides a set of Encodings provide
|
||||||
|
// the All and Common variables to reference all supported encodings and
|
||||||
|
// commonly used subset. This allows Index implementations to include all
|
||||||
|
// available encodings without explicitly referencing or knowing about them.
|
||||||
|
package identifier
|
||||||
|
|
||||||
|
// Note: this package is internal, but could be made public if there is a need
|
||||||
|
// for writing third-party Indexes and Encodings.
|
||||||
|
|
||||||
|
// References:
|
||||||
|
// - http://source.icu-project.org/repos/icu/icu/trunk/source/data/mappings/convrtrs.txt
|
||||||
|
// - http://www.iana.org/assignments/character-sets/character-sets.xhtml
|
||||||
|
// - http://www.iana.org/assignments/ianacharset-mib/ianacharset-mib
|
||||||
|
// - http://www.ietf.org/rfc/rfc2978.txt
|
||||||
|
// - https://www.unicode.org/reports/tr22/
|
||||||
|
// - http://www.w3.org/TR/encoding/
|
||||||
|
// - https://encoding.spec.whatwg.org/
|
||||||
|
// - https://encoding.spec.whatwg.org/encodings.json
|
||||||
|
// - https://tools.ietf.org/html/rfc6657#section-5
|
||||||
|
|
||||||
|
// Interface can be implemented by Encodings to define the CCS or CES for which
|
||||||
|
// it implements conversions.
|
||||||
|
type Interface interface {
|
||||||
|
// ID returns an encoding identifier. Exactly one of the mib and other
|
||||||
|
// values should be non-zero.
|
||||||
|
//
|
||||||
|
// In the usual case it is only necessary to indicate the MIB code. The
|
||||||
|
// other string can be used to specify encodings for which there is no MIB,
|
||||||
|
// such as "x-mac-dingbat".
|
||||||
|
//
|
||||||
|
// The other string may only contain the characters a-z, A-Z, 0-9, - and _.
|
||||||
|
ID() (mib MIB, other string)
|
||||||
|
|
||||||
|
// NOTE: the restrictions on the encoding are to allow extending the syntax
|
||||||
|
// with additional information such as versions, vendors and other variants.
|
||||||
|
}
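// Illustrative sketch (not part of the upstream file): a type satisfying
// Interface for an encoding that has no MIB code, per the contract above.
// Since this package is internal, only code within golang.org/x/text can
// actually implement it.
//
//	type macDingbat struct{}
//
//	func (macDingbat) ID() (mib identifier.MIB, other string) {
//		return 0, "x-mac-dingbat"
//	}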
|
||||||
|
|
||||||
|
// A MIB identifies an encoding. It is derived from the IANA MIB codes and adds
|
||||||
|
// some identifiers for some encodings that are not covered by the IANA
|
||||||
|
// standard.
|
||||||
|
//
|
||||||
|
// See http://www.iana.org/assignments/ianacharset-mib.
|
||||||
|
type MIB uint16
|
||||||
|
|
||||||
|
// These additional MIB types are not defined in IANA. They are added because
|
||||||
|
// they are common and defined within the text repo.
|
||||||
|
const (
|
||||||
|
// Unofficial marks the start of encodings not registered by IANA.
|
||||||
|
Unofficial MIB = 10000 + iota
|
||||||
|
|
||||||
|
// Replacement is the WhatWG replacement encoding.
|
||||||
|
Replacement
|
||||||
|
|
||||||
|
// XUserDefined is the code for x-user-defined.
|
||||||
|
XUserDefined
|
||||||
|
|
||||||
|
// MacintoshCyrillic is the code for x-mac-cyrillic.
|
||||||
|
MacintoshCyrillic
|
||||||
|
)
|
16
vendor/golang.org/x/text/internal/language/common.go
generated
vendored
Normal file
|
@ -0,0 +1,16 @@
|
||||||
|
// Code generated by running "go generate" in golang.org/x/text. DO NOT EDIT.
|
||||||
|
|
||||||
|
package language
|
||||||
|
|
||||||
|
// This file contains code common to the maketables.go and the package code.
|
||||||
|
|
||||||
|
// AliasType is the type of an alias in AliasMap.
|
||||||
|
type AliasType int8
|
||||||
|
|
||||||
|
const (
|
||||||
|
Deprecated AliasType = iota
|
||||||
|
Macro
|
||||||
|
Legacy
|
||||||
|
|
||||||
|
AliasTypeUnknown AliasType = -1
|
||||||
|
)
|
29
vendor/golang.org/x/text/internal/language/compact.go
generated
vendored
Normal file
|
@ -0,0 +1,29 @@
|
||||||
|
// Copyright 2018 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package language
|
||||||
|
|
||||||
|
// CompactCoreInfo is a compact integer with the three core tags encoded.
|
||||||
|
type CompactCoreInfo uint32
|
||||||
|
|
||||||
|
// GetCompactCore generates a uint32 value that is guaranteed to be unique for
|
||||||
|
// different language, region, and script values.
|
||||||
|
func GetCompactCore(t Tag) (cci CompactCoreInfo, ok bool) {
|
||||||
|
if t.LangID > langNoIndexOffset {
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
cci |= CompactCoreInfo(t.LangID) << (8 + 12)
|
||||||
|
cci |= CompactCoreInfo(t.ScriptID) << 12
|
||||||
|
cci |= CompactCoreInfo(t.RegionID)
|
||||||
|
return cci, true
|
||||||
|
}
|
||||||
|
|
||||||
|
// Tag generates a tag from c.
|
||||||
|
func (c CompactCoreInfo) Tag() Tag {
|
||||||
|
return Tag{
|
||||||
|
LangID: Language(c >> 20),
|
||||||
|
RegionID: Region(c & 0x3ff),
|
||||||
|
ScriptID: Script(c>>12) & 0xff,
|
||||||
|
}
|
||||||
|
}
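// Illustrative note (not part of the upstream file): GetCompactCore and Tag
// above pack the three core IDs into disjoint bit ranges of the uint32, which
// is what makes the value unique per (language, script, region) combination:
//
//	bits 20..31  LangID   (cci >> 20)
//	bits 12..19  ScriptID ((cci >> 12) & 0xff)
//	bits  0..9   RegionID (cci & 0x3ff)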
|
61
vendor/golang.org/x/text/internal/language/compact/compact.go
generated
vendored
Normal file
|
@ -0,0 +1,61 @@
|
||||||
|
// Copyright 2018 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// Package compact defines a compact representation of language tags.
|
||||||
|
//
|
||||||
|
// Common language tags (at least all for which locale information is defined
|
||||||
|
// in CLDR) are assigned a unique index. Each Tag is associated with such an
|
||||||
|
// ID for selecting language-related resources (such as translations) as well
|
||||||
|
// as one for selecting regional defaults (currency, number formatting, etc.)
|
||||||
|
//
|
||||||
|
// It may want to export this functionality at some point, but at this point
|
||||||
|
// this is only available for use within x/text.
|
||||||
|
package compact // import "golang.org/x/text/internal/language/compact"
|
||||||
|
|
||||||
|
import (
|
||||||
|
"sort"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/text/internal/language"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ID is an integer identifying a single tag.
|
||||||
|
type ID uint16
|
||||||
|
|
||||||
|
func getCoreIndex(t language.Tag) (id ID, ok bool) {
|
||||||
|
cci, ok := language.GetCompactCore(t)
|
||||||
|
if !ok {
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
i := sort.Search(len(coreTags), func(i int) bool {
|
||||||
|
return cci <= coreTags[i]
|
||||||
|
})
|
||||||
|
if i == len(coreTags) || coreTags[i] != cci {
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
return ID(i), true
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parent returns the ID of the parent or the root ID if id is already the root.
|
||||||
|
func (id ID) Parent() ID {
|
||||||
|
return parents[id]
|
||||||
|
}
|
||||||
|
|
||||||
|
// Tag converts id to an internal language Tag.
|
||||||
|
func (id ID) Tag() language.Tag {
|
||||||
|
if int(id) >= len(coreTags) {
|
||||||
|
return specialTags[int(id)-len(coreTags)]
|
||||||
|
}
|
||||||
|
return coreTags[id].Tag()
|
||||||
|
}
|
||||||
|
|
||||||
|
var specialTags []language.Tag
|
||||||
|
|
||||||
|
func init() {
|
||||||
|
tags := strings.Split(specialTagsStr, " ")
|
||||||
|
specialTags = make([]language.Tag, len(tags))
|
||||||
|
for i, t := range tags {
|
||||||
|
specialTags[i] = language.MustParse(t)
|
||||||
|
}
|
||||||
|
}
|
260
vendor/golang.org/x/text/internal/language/compact/language.go
generated
vendored
Normal file
|
@ -0,0 +1,260 @@
|
||||||
|
// Copyright 2013 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
//go:generate go run gen.go gen_index.go -output tables.go
|
||||||
|
//go:generate go run gen_parents.go
|
||||||
|
|
||||||
|
package compact
|
||||||
|
|
||||||
|
// TODO: Remove above NOTE after:
|
||||||
|
// - verifying that tables are dropped correctly (most notably matcher tables).
|
||||||
|
|
||||||
|
import (
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"golang.org/x/text/internal/language"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Tag represents a BCP 47 language tag. It is used to specify an instance of a
|
||||||
|
// specific language or locale. All language tag values are guaranteed to be
|
||||||
|
// well-formed.
|
||||||
|
type Tag struct {
|
||||||
|
// NOTE: exported tags will become part of the public API.
|
||||||
|
language ID
|
||||||
|
locale ID
|
||||||
|
full fullTag // always a language.Tag for now.
|
||||||
|
}
|
||||||
|
|
||||||
|
const _und = 0
|
||||||
|
|
||||||
|
type fullTag interface {
|
||||||
|
IsRoot() bool
|
||||||
|
Parent() language.Tag
|
||||||
|
}
|
||||||
|
|
||||||
|
// Make a compact Tag from a fully specified internal language Tag.
|
||||||
|
func Make(t language.Tag) (tag Tag) {
|
||||||
|
if region := t.TypeForKey("rg"); len(region) == 6 && region[2:] == "zzzz" {
|
||||||
|
if r, err := language.ParseRegion(region[:2]); err == nil {
|
||||||
|
tFull := t
|
||||||
|
t, _ = t.SetTypeForKey("rg", "")
|
||||||
|
// TODO: should we not consider "va" for the language tag?
|
||||||
|
var exact1, exact2 bool
|
||||||
|
tag.language, exact1 = FromTag(t)
|
||||||
|
t.RegionID = r
|
||||||
|
tag.locale, exact2 = FromTag(t)
|
||||||
|
if !exact1 || !exact2 {
|
||||||
|
tag.full = tFull
|
||||||
|
}
|
||||||
|
return tag
|
||||||
|
}
|
||||||
|
}
|
||||||
|
lang, ok := FromTag(t)
|
||||||
|
tag.language = lang
|
||||||
|
tag.locale = lang
|
||||||
|
if !ok {
|
||||||
|
tag.full = t
|
||||||
|
}
|
||||||
|
return tag
|
||||||
|
}
|
||||||
|
|
||||||
|
// Tag returns an internal language Tag version of this tag.
|
||||||
|
func (t Tag) Tag() language.Tag {
|
||||||
|
if t.full != nil {
|
||||||
|
return t.full.(language.Tag)
|
||||||
|
}
|
||||||
|
tag := t.language.Tag()
|
||||||
|
if t.language != t.locale {
|
||||||
|
loc := t.locale.Tag()
|
||||||
|
tag, _ = tag.SetTypeForKey("rg", strings.ToLower(loc.RegionID.String())+"zzzz")
|
||||||
|
}
|
||||||
|
return tag
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsCompact reports whether this tag is fully defined in terms of ID.
|
||||||
|
func (t *Tag) IsCompact() bool {
|
||||||
|
return t.full == nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// MayHaveVariants reports whether a tag may have variants. If it returns false
|
||||||
|
// it is guaranteed the tag does not have variants.
|
||||||
|
func (t Tag) MayHaveVariants() bool {
|
||||||
|
return t.full != nil || int(t.language) >= len(coreTags)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MayHaveExtensions reports whether a tag may have extensions. If it returns
|
||||||
|
// false it is guaranteed the tag does not have them.
|
||||||
|
func (t Tag) MayHaveExtensions() bool {
|
||||||
|
return t.full != nil ||
|
||||||
|
int(t.language) >= len(coreTags) ||
|
||||||
|
t.language != t.locale
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsRoot returns true if t is equal to language "und".
|
||||||
|
func (t Tag) IsRoot() bool {
|
||||||
|
if t.full != nil {
|
||||||
|
return t.full.IsRoot()
|
||||||
|
}
|
||||||
|
return t.language == _und
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parent returns the CLDR parent of t. In CLDR, missing fields in data for a
|
||||||
|
// specific language are substituted with fields from the parent language.
|
||||||
|
// The parent for a language may change for newer versions of CLDR.
|
||||||
|
func (t Tag) Parent() Tag {
|
||||||
|
if t.full != nil {
|
||||||
|
return Make(t.full.Parent())
|
||||||
|
}
|
||||||
|
if t.language != t.locale {
|
||||||
|
// Simulate stripping -u-rg-xxxxxx
|
||||||
|
return Tag{language: t.language, locale: t.language}
|
||||||
|
}
|
||||||
|
// TODO: use parent lookup table once cycle from internal package is
|
||||||
|
// removed. Probably by internalizing the table and declaring this fast
|
||||||
|
// enough.
|
||||||
|
// lang := compactID(internal.Parent(uint16(t.language)))
|
||||||
|
lang, _ := FromTag(t.language.Tag().Parent())
|
||||||
|
return Tag{language: lang, locale: lang}
|
||||||
|
}
|
||||||
|
|
||||||
|
// returns token t and the rest of the string.
|
||||||
|
func nextToken(s string) (t, tail string) {
|
||||||
|
p := strings.Index(s[1:], "-")
|
||||||
|
if p == -1 {
|
||||||
|
return s[1:], ""
|
||||||
|
}
|
||||||
|
p++
|
||||||
|
return s[1:p], s[p:]
|
||||||
|
}
|
||||||
|
|
||||||
|
// LanguageID returns an index, where 0 <= index < NumCompactTags, for tags
|
||||||
|
// for which data exists in the text repository. The index will change over time
|
||||||
|
// and should not be stored in persistent storage. If t does not match a compact
|
||||||
|
// index, exact will be false and the compact index will be returned for the
|
||||||
|
// first match after repeatedly taking the Parent of t.
|
||||||
|
func LanguageID(t Tag) (id ID, exact bool) {
|
||||||
|
return t.language, t.full == nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// RegionalID returns the ID for the regional variant of this tag. This index is
|
||||||
|
// used to indicate region-specific overrides, such as default currency, default
|
||||||
|
// calendar and week data, default time cycle, and default measurement system
|
||||||
|
// and unit preferences.
|
||||||
|
//
|
||||||
|
// For instance, the tag en-GB-u-rg-uszzzz specifies British English with US
|
||||||
|
// settings for currency, number formatting, etc. The CompactIndex for this tag
|
||||||
|
// will be that for en-GB, while the RegionalID will be the one corresponding to
|
||||||
|
// en-US.
|
||||||
|
func RegionalID(t Tag) (id ID, exact bool) {
|
||||||
|
return t.locale, t.full == nil
|
||||||
|
}
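// Illustrative sketch (not part of the upstream file): for a tag carrying a
// regional override, LanguageID and RegionalID report different indexes, as
// described above. This uses the internal language package's MustParse.
//
//	t := Make(language.MustParse("en-GB-u-rg-uszzzz"))
//	langID, _ := LanguageID(t) // compact index for en-GB
//	regID, _ := RegionalID(t)  // compact index for en-US settings
//	_, _ = langID, regID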
|
||||||
|
|
||||||
|
// LanguageTag returns t stripped of regional variant indicators.
|
||||||
|
//
|
||||||
|
// At the moment this means it is stripped of a regional and variant subtag "rg"
|
||||||
|
// and "va" in the "u" extension.
|
||||||
|
func (t Tag) LanguageTag() Tag {
|
||||||
|
if t.full == nil {
|
||||||
|
return Tag{language: t.language, locale: t.language}
|
||||||
|
}
|
||||||
|
tt := t.Tag()
|
||||||
|
tt.SetTypeForKey("rg", "")
|
||||||
|
tt.SetTypeForKey("va", "")
|
||||||
|
return Make(tt)
|
||||||
|
}
|
||||||
|
|
||||||
|
// RegionalTag returns the regional variant of the tag.
|
||||||
|
//
|
||||||
|
// At the moment this means that the region is set from the regional subtag
|
||||||
|
// "rg" in the "u" extension.
|
||||||
|
func (t Tag) RegionalTag() Tag {
|
||||||
|
rt := Tag{language: t.locale, locale: t.locale}
|
||||||
|
if t.full == nil {
|
||||||
|
return rt
|
||||||
|
}
|
||||||
|
b := language.Builder{}
|
||||||
|
tag := t.Tag()
|
||||||
|
// tag, _ = tag.SetTypeForKey("rg", "")
|
||||||
|
b.SetTag(t.locale.Tag())
|
||||||
|
if v := tag.Variants(); v != "" {
|
||||||
|
for _, v := range strings.Split(v, "-") {
|
||||||
|
b.AddVariant(v)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for _, e := range tag.Extensions() {
|
||||||
|
b.AddExt(e)
|
||||||
|
}
|
||||||
|
return t
|
||||||
|
}
|
||||||
|
|
||||||
|
// FromTag reports closest matching ID for an internal language Tag.
|
||||||
|
func FromTag(t language.Tag) (id ID, exact bool) {
|
||||||
|
// TODO: perhaps give more frequent tags a lower index.
|
||||||
|
// TODO: we could make the indexes stable. This will exclude some
|
||||||
|
// possibilities for optimization, so don't do this quite yet.
|
||||||
|
exact = true
|
||||||
|
|
||||||
|
b, s, r := t.Raw()
|
||||||
|
if t.HasString() {
|
||||||
|
if t.IsPrivateUse() {
|
||||||
|
// We have no entries for user-defined tags.
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
hasExtra := false
|
||||||
|
if t.HasVariants() {
|
||||||
|
if t.HasExtensions() {
|
||||||
|
build := language.Builder{}
|
||||||
|
build.SetTag(language.Tag{LangID: b, ScriptID: s, RegionID: r})
|
||||||
|
build.AddVariant(t.Variants())
|
||||||
|
exact = false
|
||||||
|
t = build.Make()
|
||||||
|
}
|
||||||
|
hasExtra = true
|
||||||
|
} else if _, ok := t.Extension('u'); ok {
|
||||||
|
// TODO: va may mean something else. Consider not considering it.
|
||||||
|
// Strip all but the 'va' entry.
|
||||||
|
old := t
|
||||||
|
variant := t.TypeForKey("va")
|
||||||
|
t = language.Tag{LangID: b, ScriptID: s, RegionID: r}
|
||||||
|
if variant != "" {
|
||||||
|
t, _ = t.SetTypeForKey("va", variant)
|
||||||
|
hasExtra = true
|
||||||
|
}
|
||||||
|
exact = old == t
|
||||||
|
} else {
|
||||||
|
exact = false
|
||||||
|
}
|
||||||
|
if hasExtra {
|
||||||
|
// We have some variants.
|
||||||
|
for i, s := range specialTags {
|
||||||
|
if s == t {
|
||||||
|
return ID(i + len(coreTags)), exact
|
||||||
|
}
|
||||||
|
}
|
||||||
|
exact = false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if x, ok := getCoreIndex(t); ok {
|
||||||
|
return x, exact
|
||||||
|
}
|
||||||
|
exact = false
|
||||||
|
if r != 0 && s == 0 {
|
||||||
|
// Deal with cases where an extra script is inserted for the region.
|
||||||
|
t, _ := t.Maximize()
|
||||||
|
if x, ok := getCoreIndex(t); ok {
|
||||||
|
return x, exact
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for t = t.Parent(); t != root; t = t.Parent() {
|
||||||
|
// No variants specified: just compare core components.
|
||||||
|
// The key has the form lllssrrr, where l, s, and r are nibbles for
|
||||||
|
// respectively the langID, scriptID, and regionID.
|
||||||
|
if x, ok := getCoreIndex(t); ok {
|
||||||
|
return x, exact
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return 0, exact
|
||||||
|
}
|
||||||
|
|
||||||
|
var root = language.Tag{}
|
120
vendor/golang.org/x/text/internal/language/compact/parents.go
generated
vendored
Normal file
|
@ -0,0 +1,120 @@
|
||||||
|
// Code generated by running "go generate" in golang.org/x/text. DO NOT EDIT.
|
||||||
|
|
||||||
|
package compact
|
||||||
|
|
||||||
|
// parents maps a compact index of a tag to the compact index of the parent of
|
||||||
|
// this tag.
|
||||||
|
var parents = []ID{ // 775 elements
|
||||||
|
// Entry 0 - 3F
|
||||||
|
0x0000, 0x0000, 0x0001, 0x0001, 0x0000, 0x0004, 0x0000, 0x0006,
|
||||||
|
0x0000, 0x0008, 0x0000, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a,
|
||||||
|
0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a,
|
||||||
|
0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a,
|
||||||
|
0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x000a, 0x0000,
|
||||||
|
0x0000, 0x0028, 0x0000, 0x002a, 0x0000, 0x002c, 0x0000, 0x0000,
|
||||||
|
0x002f, 0x002e, 0x002e, 0x0000, 0x0033, 0x0000, 0x0035, 0x0000,
|
||||||
|
0x0037, 0x0000, 0x0039, 0x0000, 0x003b, 0x0000, 0x0000, 0x003e,
|
||||||
|
// Entry 40 - 7F
|
||||||
|
0x0000, 0x0040, 0x0040, 0x0000, 0x0043, 0x0043, 0x0000, 0x0046,
|
||||||
|
0x0000, 0x0048, 0x0000, 0x0000, 0x004b, 0x004a, 0x004a, 0x0000,
|
||||||
|
0x004f, 0x004f, 0x004f, 0x004f, 0x0000, 0x0054, 0x0054, 0x0000,
|
||||||
|
0x0057, 0x0000, 0x0059, 0x0000, 0x005b, 0x0000, 0x005d, 0x005d,
0x0000, 0x0060, 0x0000, 0x0062, 0x0000, 0x0064, 0x0000, 0x0066,
0x0066, 0x0000, 0x0069, 0x0000, 0x006b, 0x006b, 0x006b, 0x006b,
0x006b, 0x006b, 0x006b, 0x0000, 0x0073, 0x0000, 0x0075, 0x0000,
0x0077, 0x0000, 0x0000, 0x007a, 0x0000, 0x007c, 0x0000, 0x007e,
// Entry 80 - BF
0x0000, 0x0080, 0x0080, 0x0000, 0x0083, 0x0083, 0x0000, 0x0086,
0x0087, 0x0087, 0x0087, 0x0086, 0x0088, 0x0087, 0x0087, 0x0087,
0x0086, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0088,
0x0087, 0x0087, 0x0087, 0x0087, 0x0088, 0x0087, 0x0088, 0x0087,
0x0087, 0x0088, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087,
0x0087, 0x0087, 0x0087, 0x0086, 0x0087, 0x0087, 0x0087, 0x0087,
0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087,
0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0086, 0x0087, 0x0086,
// Entry C0 - FF
0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087,
0x0088, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087,
0x0086, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0088, 0x0087,
0x0087, 0x0088, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087,
0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0086, 0x0086, 0x0087,
0x0087, 0x0086, 0x0087, 0x0087, 0x0087, 0x0087, 0x0087, 0x0000,
0x00ef, 0x0000, 0x00f1, 0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x00f2,
0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x00f1, 0x00f2, 0x00f1, 0x00f1,
// Entry 100 - 13F
0x00f2, 0x00f2, 0x00f1, 0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x00f1,
0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x00f2, 0x0000, 0x010e,
0x0000, 0x0110, 0x0000, 0x0112, 0x0000, 0x0114, 0x0114, 0x0000,
0x0117, 0x0117, 0x0117, 0x0117, 0x0000, 0x011c, 0x0000, 0x011e,
0x0000, 0x0120, 0x0120, 0x0000, 0x0123, 0x0123, 0x0123, 0x0123,
0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123,
0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123,
0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123,
// Entry 140 - 17F
0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123,
0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123, 0x0123,
0x0123, 0x0123, 0x0000, 0x0152, 0x0000, 0x0154, 0x0000, 0x0156,
0x0000, 0x0158, 0x0000, 0x015a, 0x0000, 0x015c, 0x015c, 0x015c,
0x0000, 0x0160, 0x0000, 0x0000, 0x0163, 0x0000, 0x0165, 0x0000,
0x0167, 0x0167, 0x0167, 0x0000, 0x016b, 0x0000, 0x016d, 0x0000,
0x016f, 0x0000, 0x0171, 0x0171, 0x0000, 0x0174, 0x0000, 0x0176,
0x0000, 0x0178, 0x0000, 0x017a, 0x0000, 0x017c, 0x0000, 0x017e,
// Entry 180 - 1BF
0x0000, 0x0000, 0x0000, 0x0182, 0x0000, 0x0184, 0x0184, 0x0184,
0x0184, 0x0000, 0x0000, 0x0000, 0x018b, 0x0000, 0x0000, 0x018e,
0x0000, 0x0000, 0x0191, 0x0000, 0x0000, 0x0000, 0x0195, 0x0000,
0x0197, 0x0000, 0x0000, 0x019a, 0x0000, 0x0000, 0x019d, 0x0000,
0x019f, 0x0000, 0x01a1, 0x0000, 0x01a3, 0x0000, 0x01a5, 0x0000,
0x01a7, 0x0000, 0x01a9, 0x0000, 0x01ab, 0x0000, 0x01ad, 0x0000,
0x01af, 0x0000, 0x01b1, 0x01b1, 0x0000, 0x01b4, 0x0000, 0x01b6,
0x0000, 0x01b8, 0x0000, 0x01ba, 0x0000, 0x01bc, 0x0000, 0x0000,
// Entry 1C0 - 1FF
0x01bf, 0x0000, 0x01c1, 0x0000, 0x01c3, 0x0000, 0x01c5, 0x0000,
0x01c7, 0x0000, 0x01c9, 0x0000, 0x01cb, 0x01cb, 0x01cb, 0x01cb,
0x0000, 0x01d0, 0x0000, 0x01d2, 0x01d2, 0x0000, 0x01d5, 0x0000,
0x01d7, 0x0000, 0x01d9, 0x0000, 0x01db, 0x0000, 0x01dd, 0x0000,
0x01df, 0x01df, 0x0000, 0x01e2, 0x0000, 0x01e4, 0x0000, 0x01e6,
0x0000, 0x01e8, 0x0000, 0x01ea, 0x0000, 0x01ec, 0x0000, 0x01ee,
0x0000, 0x01f0, 0x0000, 0x0000, 0x01f3, 0x0000, 0x01f5, 0x01f5,
0x01f5, 0x0000, 0x01f9, 0x0000, 0x01fb, 0x0000, 0x01fd, 0x0000,
// Entry 200 - 23F
0x01ff, 0x0000, 0x0000, 0x0202, 0x0000, 0x0204, 0x0204, 0x0000,
0x0207, 0x0000, 0x0209, 0x0209, 0x0000, 0x020c, 0x020c, 0x0000,
0x020f, 0x020f, 0x020f, 0x020f, 0x020f, 0x020f, 0x020f, 0x0000,
0x0217, 0x0000, 0x0219, 0x0000, 0x021b, 0x0000, 0x0000, 0x0000,
0x0000, 0x0000, 0x0221, 0x0000, 0x0000, 0x0224, 0x0000, 0x0226,
0x0226, 0x0000, 0x0229, 0x0000, 0x022b, 0x022b, 0x0000, 0x0000,
0x022f, 0x022e, 0x022e, 0x0000, 0x0000, 0x0234, 0x0000, 0x0236,
0x0000, 0x0238, 0x0000, 0x0244, 0x023a, 0x0244, 0x0244, 0x0244,
// Entry 240 - 27F
0x0244, 0x0244, 0x0244, 0x0244, 0x023a, 0x0244, 0x0244, 0x0000,
0x0247, 0x0247, 0x0247, 0x0000, 0x024b, 0x0000, 0x024d, 0x0000,
0x024f, 0x024f, 0x0000, 0x0252, 0x0000, 0x0254, 0x0254, 0x0254,
0x0254, 0x0254, 0x0254, 0x0000, 0x025b, 0x0000, 0x025d, 0x0000,
0x025f, 0x0000, 0x0261, 0x0000, 0x0263, 0x0000, 0x0265, 0x0000,
0x0000, 0x0268, 0x0268, 0x0268, 0x0000, 0x026c, 0x0000, 0x026e,
0x0000, 0x0270, 0x0000, 0x0000, 0x0000, 0x0274, 0x0273, 0x0273,
0x0000, 0x0278, 0x0000, 0x027a, 0x0000, 0x027c, 0x0000, 0x0000,
// Entry 280 - 2BF
0x0000, 0x0000, 0x0281, 0x0000, 0x0000, 0x0284, 0x0000, 0x0286,
0x0286, 0x0286, 0x0286, 0x0000, 0x028b, 0x028b, 0x028b, 0x0000,
0x028f, 0x028f, 0x028f, 0x028f, 0x028f, 0x0000, 0x0295, 0x0295,
0x0295, 0x0295, 0x0000, 0x0000, 0x0000, 0x0000, 0x029d, 0x029d,
0x029d, 0x0000, 0x02a1, 0x02a1, 0x02a1, 0x02a1, 0x0000, 0x0000,
0x02a7, 0x02a7, 0x02a7, 0x02a7, 0x0000, 0x02ac, 0x0000, 0x02ae,
0x02ae, 0x0000, 0x02b1, 0x0000, 0x02b3, 0x0000, 0x02b5, 0x02b5,
0x0000, 0x0000, 0x02b9, 0x0000, 0x0000, 0x0000, 0x02bd, 0x0000,
// Entry 2C0 - 2FF
0x02bf, 0x02bf, 0x0000, 0x0000, 0x02c3, 0x0000, 0x02c5, 0x0000,
0x02c7, 0x0000, 0x02c9, 0x0000, 0x02cb, 0x0000, 0x02cd, 0x02cd,
0x0000, 0x0000, 0x02d1, 0x0000, 0x02d3, 0x02d0, 0x02d0, 0x0000,
0x0000, 0x02d8, 0x02d7, 0x02d7, 0x0000, 0x0000, 0x02dd, 0x0000,
0x02df, 0x0000, 0x02e1, 0x0000, 0x0000, 0x02e4, 0x0000, 0x02e6,
0x0000, 0x0000, 0x02e9, 0x0000, 0x02eb, 0x0000, 0x02ed, 0x0000,
0x02ef, 0x02ef, 0x0000, 0x0000, 0x02f3, 0x02f2, 0x02f2, 0x0000,
0x02f7, 0x0000, 0x02f9, 0x02f9, 0x02f9, 0x02f9, 0x02f9, 0x0000,
// Entry 300 - 33F
0x02ff, 0x0300, 0x02ff, 0x0000, 0x0303, 0x0051, 0x00e6,
} // Size: 1574 bytes

// Total table size 1574 bytes (1KiB); checksum: 895AAF0B
|
1015
vendor/golang.org/x/text/internal/language/compact/tables.go
generated
vendored
Normal file
File diff suppressed because it is too large
91
vendor/golang.org/x/text/internal/language/compact/tags.go
generated
vendored
Normal file
@ -0,0 +1,91 @@
// Copyright 2013 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package compact

var (
und = Tag{}

Und Tag = Tag{}

Afrikaans Tag = Tag{language: afIndex, locale: afIndex}
Amharic Tag = Tag{language: amIndex, locale: amIndex}
Arabic Tag = Tag{language: arIndex, locale: arIndex}
ModernStandardArabic Tag = Tag{language: ar001Index, locale: ar001Index}
Azerbaijani Tag = Tag{language: azIndex, locale: azIndex}
Bulgarian Tag = Tag{language: bgIndex, locale: bgIndex}
Bengali Tag = Tag{language: bnIndex, locale: bnIndex}
Catalan Tag = Tag{language: caIndex, locale: caIndex}
Czech Tag = Tag{language: csIndex, locale: csIndex}
Danish Tag = Tag{language: daIndex, locale: daIndex}
German Tag = Tag{language: deIndex, locale: deIndex}
Greek Tag = Tag{language: elIndex, locale: elIndex}
English Tag = Tag{language: enIndex, locale: enIndex}
AmericanEnglish Tag = Tag{language: enUSIndex, locale: enUSIndex}
BritishEnglish Tag = Tag{language: enGBIndex, locale: enGBIndex}
Spanish Tag = Tag{language: esIndex, locale: esIndex}
EuropeanSpanish Tag = Tag{language: esESIndex, locale: esESIndex}
LatinAmericanSpanish Tag = Tag{language: es419Index, locale: es419Index}
Estonian Tag = Tag{language: etIndex, locale: etIndex}
Persian Tag = Tag{language: faIndex, locale: faIndex}
Finnish Tag = Tag{language: fiIndex, locale: fiIndex}
Filipino Tag = Tag{language: filIndex, locale: filIndex}
French Tag = Tag{language: frIndex, locale: frIndex}
CanadianFrench Tag = Tag{language: frCAIndex, locale: frCAIndex}
Gujarati Tag = Tag{language: guIndex, locale: guIndex}
Hebrew Tag = Tag{language: heIndex, locale: heIndex}
Hindi Tag = Tag{language: hiIndex, locale: hiIndex}
Croatian Tag = Tag{language: hrIndex, locale: hrIndex}
Hungarian Tag = Tag{language: huIndex, locale: huIndex}
Armenian Tag = Tag{language: hyIndex, locale: hyIndex}
Indonesian Tag = Tag{language: idIndex, locale: idIndex}
Icelandic Tag = Tag{language: isIndex, locale: isIndex}
Italian Tag = Tag{language: itIndex, locale: itIndex}
Japanese Tag = Tag{language: jaIndex, locale: jaIndex}
Georgian Tag = Tag{language: kaIndex, locale: kaIndex}
Kazakh Tag = Tag{language: kkIndex, locale: kkIndex}
Khmer Tag = Tag{language: kmIndex, locale: kmIndex}
Kannada Tag = Tag{language: knIndex, locale: knIndex}
Korean Tag = Tag{language: koIndex, locale: koIndex}
Kirghiz Tag = Tag{language: kyIndex, locale: kyIndex}
Lao Tag = Tag{language: loIndex, locale: loIndex}
Lithuanian Tag = Tag{language: ltIndex, locale: ltIndex}
Latvian Tag = Tag{language: lvIndex, locale: lvIndex}
Macedonian Tag = Tag{language: mkIndex, locale: mkIndex}
Malayalam Tag = Tag{language: mlIndex, locale: mlIndex}
Mongolian Tag = Tag{language: mnIndex, locale: mnIndex}
Marathi Tag = Tag{language: mrIndex, locale: mrIndex}
Malay Tag = Tag{language: msIndex, locale: msIndex}
Burmese Tag = Tag{language: myIndex, locale: myIndex}
Nepali Tag = Tag{language: neIndex, locale: neIndex}
Dutch Tag = Tag{language: nlIndex, locale: nlIndex}
Norwegian Tag = Tag{language: noIndex, locale: noIndex}
Punjabi Tag = Tag{language: paIndex, locale: paIndex}
Polish Tag = Tag{language: plIndex, locale: plIndex}
Portuguese Tag = Tag{language: ptIndex, locale: ptIndex}
BrazilianPortuguese Tag = Tag{language: ptBRIndex, locale: ptBRIndex}
EuropeanPortuguese Tag = Tag{language: ptPTIndex, locale: ptPTIndex}
Romanian Tag = Tag{language: roIndex, locale: roIndex}
Russian Tag = Tag{language: ruIndex, locale: ruIndex}
Sinhala Tag = Tag{language: siIndex, locale: siIndex}
Slovak Tag = Tag{language: skIndex, locale: skIndex}
Slovenian Tag = Tag{language: slIndex, locale: slIndex}
Albanian Tag = Tag{language: sqIndex, locale: sqIndex}
Serbian Tag = Tag{language: srIndex, locale: srIndex}
SerbianLatin Tag = Tag{language: srLatnIndex, locale: srLatnIndex}
Swedish Tag = Tag{language: svIndex, locale: svIndex}
Swahili Tag = Tag{language: swIndex, locale: swIndex}
Tamil Tag = Tag{language: taIndex, locale: taIndex}
Telugu Tag = Tag{language: teIndex, locale: teIndex}
Thai Tag = Tag{language: thIndex, locale: thIndex}
Turkish Tag = Tag{language: trIndex, locale: trIndex}
Ukrainian Tag = Tag{language: ukIndex, locale: ukIndex}
Urdu Tag = Tag{language: urIndex, locale: urIndex}
Uzbek Tag = Tag{language: uzIndex, locale: uzIndex}
Vietnamese Tag = Tag{language: viIndex, locale: viIndex}
Chinese Tag = Tag{language: zhIndex, locale: zhIndex}
SimplifiedChinese Tag = Tag{language: zhHansIndex, locale: zhHansIndex}
TraditionalChinese Tag = Tag{language: zhHantIndex, locale: zhHantIndex}
Zulu Tag = Tag{language: zuIndex, locale: zuIndex}
)
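Editor's note, not part of the vendored diff: the compact tags above back the predefined tags exposed by the public golang.org/x/text/language package. A minimal, hedged sketch of using those public counterparts:

package main

import (
	"fmt"

	"golang.org/x/text/language"
)

func main() {
	// language.English, language.BrazilianPortuguese, etc. are the public
	// counterparts of the compact tags defined in tags.go above.
	for _, t := range []language.Tag{language.English, language.BrazilianPortuguese, language.SimplifiedChinese} {
		fmt.Println(t) // prints "en", "pt-BR", "zh-Hans"
	}
}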
167
vendor/golang.org/x/text/internal/language/compose.go
generated
vendored
Normal file
@ -0,0 +1,167 @@
// Copyright 2018 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package language

import (
"sort"
"strings"
)

// A Builder allows constructing a Tag from individual components.
// Its main user is Compose in the top-level language package.
type Builder struct {
Tag Tag

private string // the x extension
variants []string
extensions []string
}

// Make returns a new Tag from the current settings.
func (b *Builder) Make() Tag {
t := b.Tag

if len(b.extensions) > 0 || len(b.variants) > 0 {
sort.Sort(sortVariants(b.variants))
sort.Strings(b.extensions)

if b.private != "" {
b.extensions = append(b.extensions, b.private)
}
n := maxCoreSize + tokenLen(b.variants...) + tokenLen(b.extensions...)
buf := make([]byte, n)
p := t.genCoreBytes(buf)
t.pVariant = byte(p)
p += appendTokens(buf[p:], b.variants...)
t.pExt = uint16(p)
p += appendTokens(buf[p:], b.extensions...)
t.str = string(buf[:p])
// We may not always need to remake the string, but when or when not
// to do so is rather tricky.
scan := makeScanner(buf[:p])
t, _ = parse(&scan, "")
return t

} else if b.private != "" {
t.str = b.private
t.RemakeString()
}
return t
}

// SetTag copies all the settings from a given Tag. Any previously set values
// are discarded.
func (b *Builder) SetTag(t Tag) {
b.Tag.LangID = t.LangID
b.Tag.RegionID = t.RegionID
b.Tag.ScriptID = t.ScriptID
// TODO: optimize
b.variants = b.variants[:0]
if variants := t.Variants(); variants != "" {
for _, vr := range strings.Split(variants[1:], "-") {
b.variants = append(b.variants, vr)
}
}
b.extensions, b.private = b.extensions[:0], ""
for _, e := range t.Extensions() {
b.AddExt(e)
}
}

// AddExt adds extension e to the tag. e must be a valid extension as returned
// by Tag.Extension. If the extension already exists, it will be discarded,
// except for a -u extension, where non-existing key-type pairs will added.
func (b *Builder) AddExt(e string) {
if e[0] == 'x' {
if b.private == "" {
b.private = e
}
return
}
for i, s := range b.extensions {
if s[0] == e[0] {
if e[0] == 'u' {
b.extensions[i] += e[1:]
}
return
}
}
b.extensions = append(b.extensions, e)
}

// SetExt sets the extension e to the tag. e must be a valid extension as
// returned by Tag.Extension. If the extension already exists, it will be
// overwritten, except for a -u extension, where the individual key-type pairs
// will be set.
func (b *Builder) SetExt(e string) {
if e[0] == 'x' {
b.private = e
return
}
for i, s := range b.extensions {
if s[0] == e[0] {
if e[0] == 'u' {
b.extensions[i] = e + s[1:]
} else {
b.extensions[i] = e
}
return
}
}
b.extensions = append(b.extensions, e)
}

// AddVariant adds any number of variants.
func (b *Builder) AddVariant(v ...string) {
for _, v := range v {
if v != "" {
b.variants = append(b.variants, v)
}
}
}

// ClearVariants removes any variants previously added, including those
// copied from a Tag in SetTag.
func (b *Builder) ClearVariants() {
b.variants = b.variants[:0]
}

// ClearExtensions removes any extensions previously added, including those
// copied from a Tag in SetTag.
func (b *Builder) ClearExtensions() {
b.private = ""
b.extensions = b.extensions[:0]
}

func tokenLen(token ...string) (n int) {
for _, t := range token {
n += len(t) + 1
}
return
}

func appendTokens(b []byte, token ...string) int {
p := 0
for _, t := range token {
b[p] = '-'
copy(b[p+1:], t)
p += 1 + len(t)
}
return p
}

type sortVariants []string

func (s sortVariants) Len() int {
return len(s)
}

func (s sortVariants) Swap(i, j int) {
s[j], s[i] = s[i], s[j]
}

func (s sortVariants) Less(i, j int) bool {
return variantIndex[s[i]] < variantIndex[s[j]]
}
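Editor's note, not part of the vendored diff: per its own comment, the internal Builder above is mainly used by Compose in the public golang.org/x/text/language package. A hedged sketch of composing a tag through that public API:

package main

import (
	"fmt"

	"golang.org/x/text/language"
)

func main() {
	// Compose assembles a well-formed tag from individual parts; invalid
	// parts produce an error instead of a malformed tag.
	base, _ := language.ParseBase("sr")
	script, _ := language.ParseScript("Latn")
	region, _ := language.ParseRegion("RS")
	t, err := language.Compose(base, script, region)
	if err != nil {
		fmt.Println("compose failed:", err)
		return
	}
	fmt.Println(t) // prints "sr-Latn-RS"
}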
28
vendor/golang.org/x/text/internal/language/coverage.go
generated
vendored
Normal file
@ -0,0 +1,28 @@
// Copyright 2014 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package language

// BaseLanguages returns the list of all supported base languages. It generates
// the list by traversing the internal structures.
func BaseLanguages() []Language {
base := make([]Language, 0, NumLanguages)
for i := 0; i < langNoIndexOffset; i++ {
// We included "und" already for the value 0.
if i != nonCanonicalUnd {
base = append(base, Language(i))
}
}
i := langNoIndexOffset
for _, v := range langNoIndex {
for k := 0; k < 8; k++ {
if v&1 == 1 {
base = append(base, Language(i))
}
v >>= 1
i++
}
}
return base
}
596
vendor/golang.org/x/text/internal/language/language.go
generated
vendored
Normal file
|
@ -0,0 +1,596 @@
|
||||||
|
// Copyright 2013 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
//go:generate go run gen.go gen_common.go -output tables.go
|
||||||
|
|
||||||
|
package language // import "golang.org/x/text/internal/language"
|
||||||
|
|
||||||
|
// TODO: Remove above NOTE after:
|
||||||
|
// - verifying that tables are dropped correctly (most notably matcher tables).
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
// maxCoreSize is the maximum size of a BCP 47 tag without variants and
|
||||||
|
// extensions. Equals max lang (3) + script (4) + max reg (3) + 2 dashes.
|
||||||
|
maxCoreSize = 12
|
||||||
|
|
||||||
|
// max99thPercentileSize is a somewhat arbitrary buffer size that presumably
|
||||||
|
// is large enough to hold at least 99% of the BCP 47 tags.
|
||||||
|
max99thPercentileSize = 32
|
||||||
|
|
||||||
|
// maxSimpleUExtensionSize is the maximum size of a -u extension with one
|
||||||
|
// key-type pair. Equals len("-u-") + key (2) + dash + max value (8).
|
||||||
|
maxSimpleUExtensionSize = 14
|
||||||
|
)
|
||||||
|
|
||||||
|
// Tag represents a BCP 47 language tag. It is used to specify an instance of a
|
||||||
|
// specific language or locale. All language tag values are guaranteed to be
|
||||||
|
// well-formed. The zero value of Tag is Und.
|
||||||
|
type Tag struct {
|
||||||
|
// TODO: the following fields have the form TagTypeID. This name is chosen
|
||||||
|
// to allow refactoring the public package without conflicting with its
|
||||||
|
// Base, Script, and Region methods. Once the transition is fully completed
|
||||||
|
// the ID can be stripped from the name.
|
||||||
|
|
||||||
|
LangID Language
|
||||||
|
RegionID Region
|
||||||
|
// TODO: we will soon run out of positions for ScriptID. Idea: instead of
|
||||||
|
// storing lang, region, and ScriptID codes, store only the compact index and
|
||||||
|
// have a lookup table from this code to its expansion. This greatly speeds
|
||||||
|
// up table lookup, speed up common variant cases.
|
||||||
|
// This will also immediately free up 3 extra bytes. Also, the pVariant
|
||||||
|
// field can now be moved to the lookup table, as the compact index uniquely
|
||||||
|
// determines the offset of a possible variant.
|
||||||
|
ScriptID Script
|
||||||
|
pVariant byte // offset in str, includes preceding '-'
|
||||||
|
pExt uint16 // offset of first extension, includes preceding '-'
|
||||||
|
|
||||||
|
// str is the string representation of the Tag. It will only be used if the
|
||||||
|
// tag has variants or extensions.
|
||||||
|
str string
|
||||||
|
}
|
||||||
|
|
||||||
|
// Make is a convenience wrapper for Parse that omits the error.
|
||||||
|
// In case of an error, a sensible default is returned.
|
||||||
|
func Make(s string) Tag {
|
||||||
|
t, _ := Parse(s)
|
||||||
|
return t
|
||||||
|
}
|
||||||
|
|
||||||
|
// Raw returns the raw base language, script and region, without making an
|
||||||
|
// attempt to infer their values.
|
||||||
|
// TODO: consider removing
|
||||||
|
func (t Tag) Raw() (b Language, s Script, r Region) {
|
||||||
|
return t.LangID, t.ScriptID, t.RegionID
|
||||||
|
}
|
||||||
|
|
||||||
|
// equalTags compares language, script and region subtags only.
|
||||||
|
func (t Tag) equalTags(a Tag) bool {
|
||||||
|
return t.LangID == a.LangID && t.ScriptID == a.ScriptID && t.RegionID == a.RegionID
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsRoot returns true if t is equal to language "und".
|
||||||
|
func (t Tag) IsRoot() bool {
|
||||||
|
if int(t.pVariant) < len(t.str) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return t.equalTags(Und)
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsPrivateUse reports whether the Tag consists solely of an IsPrivateUse use
|
||||||
|
// tag.
|
||||||
|
func (t Tag) IsPrivateUse() bool {
|
||||||
|
return t.str != "" && t.pVariant == 0
|
||||||
|
}
|
||||||
|
|
||||||
|
// RemakeString is used to update t.str in case lang, script or region changed.
|
||||||
|
// It is assumed that pExt and pVariant still point to the start of the
|
||||||
|
// respective parts.
|
||||||
|
func (t *Tag) RemakeString() {
|
||||||
|
if t.str == "" {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
extra := t.str[t.pVariant:]
|
||||||
|
if t.pVariant > 0 {
|
||||||
|
extra = extra[1:]
|
||||||
|
}
|
||||||
|
if t.equalTags(Und) && strings.HasPrefix(extra, "x-") {
|
||||||
|
t.str = extra
|
||||||
|
t.pVariant = 0
|
||||||
|
t.pExt = 0
|
||||||
|
return
|
||||||
|
}
|
||||||
|
var buf [max99thPercentileSize]byte // avoid extra memory allocation in most cases.
|
||||||
|
b := buf[:t.genCoreBytes(buf[:])]
|
||||||
|
if extra != "" {
|
||||||
|
diff := len(b) - int(t.pVariant)
|
||||||
|
b = append(b, '-')
|
||||||
|
b = append(b, extra...)
|
||||||
|
t.pVariant = uint8(int(t.pVariant) + diff)
|
||||||
|
t.pExt = uint16(int(t.pExt) + diff)
|
||||||
|
} else {
|
||||||
|
t.pVariant = uint8(len(b))
|
||||||
|
t.pExt = uint16(len(b))
|
||||||
|
}
|
||||||
|
t.str = string(b)
|
||||||
|
}
|
||||||
|
|
||||||
|
// genCoreBytes writes a string for the base languages, script and region tags
|
||||||
|
// to the given buffer and returns the number of bytes written. It will never
|
||||||
|
// write more than maxCoreSize bytes.
|
||||||
|
func (t *Tag) genCoreBytes(buf []byte) int {
|
||||||
|
n := t.LangID.StringToBuf(buf[:])
|
||||||
|
if t.ScriptID != 0 {
|
||||||
|
n += copy(buf[n:], "-")
|
||||||
|
n += copy(buf[n:], t.ScriptID.String())
|
||||||
|
}
|
||||||
|
if t.RegionID != 0 {
|
||||||
|
n += copy(buf[n:], "-")
|
||||||
|
n += copy(buf[n:], t.RegionID.String())
|
||||||
|
}
|
||||||
|
return n
|
||||||
|
}
|
||||||
|
|
||||||
|
// String returns the canonical string representation of the language tag.
|
||||||
|
func (t Tag) String() string {
|
||||||
|
if t.str != "" {
|
||||||
|
return t.str
|
||||||
|
}
|
||||||
|
if t.ScriptID == 0 && t.RegionID == 0 {
|
||||||
|
return t.LangID.String()
|
||||||
|
}
|
||||||
|
buf := [maxCoreSize]byte{}
|
||||||
|
return string(buf[:t.genCoreBytes(buf[:])])
|
||||||
|
}
|
||||||
|
|
||||||
|
// MarshalText implements encoding.TextMarshaler.
|
||||||
|
func (t Tag) MarshalText() (text []byte, err error) {
|
||||||
|
if t.str != "" {
|
||||||
|
text = append(text, t.str...)
|
||||||
|
} else if t.ScriptID == 0 && t.RegionID == 0 {
|
||||||
|
text = append(text, t.LangID.String()...)
|
||||||
|
} else {
|
||||||
|
buf := [maxCoreSize]byte{}
|
||||||
|
text = buf[:t.genCoreBytes(buf[:])]
|
||||||
|
}
|
||||||
|
return text, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// UnmarshalText implements encoding.TextUnmarshaler.
|
||||||
|
func (t *Tag) UnmarshalText(text []byte) error {
|
||||||
|
tag, err := Parse(string(text))
|
||||||
|
*t = tag
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Variants returns the part of the tag holding all variants or the empty string
|
||||||
|
// if there are no variants defined.
|
||||||
|
func (t Tag) Variants() string {
|
||||||
|
if t.pVariant == 0 {
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
return t.str[t.pVariant:t.pExt]
|
||||||
|
}
|
||||||
|
|
||||||
|
// VariantOrPrivateUseTags returns variants or private use tags.
|
||||||
|
func (t Tag) VariantOrPrivateUseTags() string {
|
||||||
|
if t.pExt > 0 {
|
||||||
|
return t.str[t.pVariant:t.pExt]
|
||||||
|
}
|
||||||
|
return t.str[t.pVariant:]
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasString reports whether this tag defines more than just the raw
|
||||||
|
// components.
|
||||||
|
func (t Tag) HasString() bool {
|
||||||
|
return t.str != ""
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parent returns the CLDR parent of t. In CLDR, missing fields in data for a
|
||||||
|
// specific language are substituted with fields from the parent language.
|
||||||
|
// The parent for a language may change for newer versions of CLDR.
|
||||||
|
func (t Tag) Parent() Tag {
|
||||||
|
if t.str != "" {
|
||||||
|
// Strip the variants and extensions.
|
||||||
|
b, s, r := t.Raw()
|
||||||
|
t = Tag{LangID: b, ScriptID: s, RegionID: r}
|
||||||
|
if t.RegionID == 0 && t.ScriptID != 0 && t.LangID != 0 {
|
||||||
|
base, _ := addTags(Tag{LangID: t.LangID})
|
||||||
|
if base.ScriptID == t.ScriptID {
|
||||||
|
return Tag{LangID: t.LangID}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return t
|
||||||
|
}
|
||||||
|
if t.LangID != 0 {
|
||||||
|
if t.RegionID != 0 {
|
||||||
|
maxScript := t.ScriptID
|
||||||
|
if maxScript == 0 {
|
||||||
|
max, _ := addTags(t)
|
||||||
|
maxScript = max.ScriptID
|
||||||
|
}
|
||||||
|
|
||||||
|
for i := range parents {
|
||||||
|
if Language(parents[i].lang) == t.LangID && Script(parents[i].maxScript) == maxScript {
|
||||||
|
for _, r := range parents[i].fromRegion {
|
||||||
|
if Region(r) == t.RegionID {
|
||||||
|
return Tag{
|
||||||
|
LangID: t.LangID,
|
||||||
|
ScriptID: Script(parents[i].script),
|
||||||
|
RegionID: Region(parents[i].toRegion),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Strip the script if it is the default one.
|
||||||
|
base, _ := addTags(Tag{LangID: t.LangID})
|
||||||
|
if base.ScriptID != maxScript {
|
||||||
|
return Tag{LangID: t.LangID, ScriptID: maxScript}
|
||||||
|
}
|
||||||
|
return Tag{LangID: t.LangID}
|
||||||
|
} else if t.ScriptID != 0 {
|
||||||
|
// The parent for an base-script pair with a non-default script is
|
||||||
|
// "und" instead of the base language.
|
||||||
|
base, _ := addTags(Tag{LangID: t.LangID})
|
||||||
|
if base.ScriptID != t.ScriptID {
|
||||||
|
return Und
|
||||||
|
}
|
||||||
|
return Tag{LangID: t.LangID}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return Und
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseExtension parses s as an extension and returns it on success.
|
||||||
|
func ParseExtension(s string) (ext string, err error) {
|
||||||
|
scan := makeScannerString(s)
|
||||||
|
var end int
|
||||||
|
if n := len(scan.token); n != 1 {
|
||||||
|
return "", ErrSyntax
|
||||||
|
}
|
||||||
|
scan.toLower(0, len(scan.b))
|
||||||
|
end = parseExtension(&scan)
|
||||||
|
if end != len(s) {
|
||||||
|
return "", ErrSyntax
|
||||||
|
}
|
||||||
|
return string(scan.b), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasVariants reports whether t has variants.
|
||||||
|
func (t Tag) HasVariants() bool {
|
||||||
|
return uint16(t.pVariant) < t.pExt
|
||||||
|
}
|
||||||
|
|
||||||
|
// HasExtensions reports whether t has extensions.
|
||||||
|
func (t Tag) HasExtensions() bool {
|
||||||
|
return int(t.pExt) < len(t.str)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Extension returns the extension of type x for tag t. It will return
|
||||||
|
// false for ok if t does not have the requested extension. The returned
|
||||||
|
// extension will be invalid in this case.
|
||||||
|
func (t Tag) Extension(x byte) (ext string, ok bool) {
|
||||||
|
for i := int(t.pExt); i < len(t.str)-1; {
|
||||||
|
var ext string
|
||||||
|
i, ext = getExtension(t.str, i)
|
||||||
|
if ext[0] == x {
|
||||||
|
return ext, true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return "", false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Extensions returns all extensions of t.
|
||||||
|
func (t Tag) Extensions() []string {
|
||||||
|
e := []string{}
|
||||||
|
for i := int(t.pExt); i < len(t.str)-1; {
|
||||||
|
var ext string
|
||||||
|
i, ext = getExtension(t.str, i)
|
||||||
|
e = append(e, ext)
|
||||||
|
}
|
||||||
|
return e
|
||||||
|
}
|
||||||
|
|
||||||
|
// TypeForKey returns the type associated with the given key, where key and type
|
||||||
|
// are of the allowed values defined for the Unicode locale extension ('u') in
|
||||||
|
// https://www.unicode.org/reports/tr35/#Unicode_Language_and_Locale_Identifiers.
|
||||||
|
// TypeForKey will traverse the inheritance chain to get the correct value.
|
||||||
|
func (t Tag) TypeForKey(key string) string {
|
||||||
|
if start, end, _ := t.findTypeForKey(key); end != start {
|
||||||
|
return t.str[start:end]
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
var (
|
||||||
|
errPrivateUse = errors.New("cannot set a key on a private use tag")
|
||||||
|
errInvalidArguments = errors.New("invalid key or type")
|
||||||
|
)
|
||||||
|
|
||||||
|
// SetTypeForKey returns a new Tag with the key set to type, where key and type
|
||||||
|
// are of the allowed values defined for the Unicode locale extension ('u') in
|
||||||
|
// https://www.unicode.org/reports/tr35/#Unicode_Language_and_Locale_Identifiers.
|
||||||
|
// An empty value removes an existing pair with the same key.
|
||||||
|
func (t Tag) SetTypeForKey(key, value string) (Tag, error) {
|
||||||
|
if t.IsPrivateUse() {
|
||||||
|
return t, errPrivateUse
|
||||||
|
}
|
||||||
|
if len(key) != 2 {
|
||||||
|
return t, errInvalidArguments
|
||||||
|
}
|
||||||
|
|
||||||
|
// Remove the setting if value is "".
|
||||||
|
if value == "" {
|
||||||
|
start, end, _ := t.findTypeForKey(key)
|
||||||
|
if start != end {
|
||||||
|
// Remove key tag and leading '-'.
|
||||||
|
start -= 4
|
||||||
|
|
||||||
|
// Remove a possible empty extension.
|
||||||
|
if (end == len(t.str) || t.str[end+2] == '-') && t.str[start-2] == '-' {
|
||||||
|
start -= 2
|
||||||
|
}
|
||||||
|
if start == int(t.pVariant) && end == len(t.str) {
|
||||||
|
t.str = ""
|
||||||
|
t.pVariant, t.pExt = 0, 0
|
||||||
|
} else {
|
||||||
|
t.str = fmt.Sprintf("%s%s", t.str[:start], t.str[end:])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(value) < 3 || len(value) > 8 {
|
||||||
|
return t, errInvalidArguments
|
||||||
|
}
|
||||||
|
|
||||||
|
var (
|
||||||
|
buf [maxCoreSize + maxSimpleUExtensionSize]byte
|
||||||
|
uStart int // start of the -u extension.
|
||||||
|
)
|
||||||
|
|
||||||
|
// Generate the tag string if needed.
|
||||||
|
if t.str == "" {
|
||||||
|
uStart = t.genCoreBytes(buf[:])
|
||||||
|
buf[uStart] = '-'
|
||||||
|
uStart++
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create new key-type pair and parse it to verify.
|
||||||
|
b := buf[uStart:]
|
||||||
|
copy(b, "u-")
|
||||||
|
copy(b[2:], key)
|
||||||
|
b[4] = '-'
|
||||||
|
b = b[:5+copy(b[5:], value)]
|
||||||
|
scan := makeScanner(b)
|
||||||
|
if parseExtensions(&scan); scan.err != nil {
|
||||||
|
return t, scan.err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Assemble the replacement string.
|
||||||
|
if t.str == "" {
|
||||||
|
t.pVariant, t.pExt = byte(uStart-1), uint16(uStart-1)
|
||||||
|
t.str = string(buf[:uStart+len(b)])
|
||||||
|
} else {
|
||||||
|
s := t.str
|
||||||
|
start, end, hasExt := t.findTypeForKey(key)
|
||||||
|
if start == end {
|
||||||
|
if hasExt {
|
||||||
|
b = b[2:]
|
||||||
|
}
|
||||||
|
t.str = fmt.Sprintf("%s-%s%s", s[:start], b, s[end:])
|
||||||
|
} else {
|
||||||
|
t.str = fmt.Sprintf("%s%s%s", s[:start], value, s[end:])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// findKeyAndType returns the start and end position for the type corresponding
|
||||||
|
// to key or the point at which to insert the key-value pair if the type
|
||||||
|
// wasn't found. The hasExt return value reports whether an -u extension was present.
|
||||||
|
// Note: the extensions are typically very small and are likely to contain
|
||||||
|
// only one key-type pair.
|
||||||
|
func (t Tag) findTypeForKey(key string) (start, end int, hasExt bool) {
|
||||||
|
p := int(t.pExt)
|
||||||
|
if len(key) != 2 || p == len(t.str) || p == 0 {
|
||||||
|
return p, p, false
|
||||||
|
}
|
||||||
|
s := t.str
|
||||||
|
|
||||||
|
// Find the correct extension.
|
||||||
|
for p++; s[p] != 'u'; p++ {
|
||||||
|
if s[p] > 'u' {
|
||||||
|
p--
|
||||||
|
return p, p, false
|
||||||
|
}
|
||||||
|
if p = nextExtension(s, p); p == len(s) {
|
||||||
|
return len(s), len(s), false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
// Proceed to the hyphen following the extension name.
|
||||||
|
p++
|
||||||
|
|
||||||
|
// curKey is the key currently being processed.
|
||||||
|
curKey := ""
|
||||||
|
|
||||||
|
// Iterate over keys until we get the end of a section.
|
||||||
|
for {
|
||||||
|
// p points to the hyphen preceding the current token.
|
||||||
|
if p3 := p + 3; s[p3] == '-' {
|
||||||
|
// Found a key.
|
||||||
|
// Check whether we just processed the key that was requested.
|
||||||
|
if curKey == key {
|
||||||
|
return start, p, true
|
||||||
|
}
|
||||||
|
// Set to the next key and continue scanning type tokens.
|
||||||
|
curKey = s[p+1 : p3]
|
||||||
|
if curKey > key {
|
||||||
|
return p, p, true
|
||||||
|
}
|
||||||
|
// Start of the type token sequence.
|
||||||
|
start = p + 4
|
||||||
|
// A type is at least 3 characters long.
|
||||||
|
p += 7 // 4 + 3
|
||||||
|
} else {
|
||||||
|
// Attribute or type, which is at least 3 characters long.
|
||||||
|
p += 4
|
||||||
|
}
|
||||||
|
// p points past the third character of a type or attribute.
|
||||||
|
max := p + 5 // maximum length of token plus hyphen.
|
||||||
|
if len(s) < max {
|
||||||
|
max = len(s)
|
||||||
|
}
|
||||||
|
for ; p < max && s[p] != '-'; p++ {
|
||||||
|
}
|
||||||
|
// Bail if we have exhausted all tokens or if the next token starts
|
||||||
|
// a new extension.
|
||||||
|
if p == len(s) || s[p+2] == '-' {
|
||||||
|
if curKey == key {
|
||||||
|
return start, p, true
|
||||||
|
}
|
||||||
|
return p, p, true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseBase parses a 2- or 3-letter ISO 639 code.
|
||||||
|
// It returns a ValueError if s is a well-formed but unknown language identifier
|
||||||
|
// or another error if another error occurred.
|
||||||
|
func ParseBase(s string) (Language, error) {
|
||||||
|
if n := len(s); n < 2 || 3 < n {
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
var buf [3]byte
|
||||||
|
return getLangID(buf[:copy(buf[:], s)])
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseScript parses a 4-letter ISO 15924 code.
|
||||||
|
// It returns a ValueError if s is a well-formed but unknown script identifier
|
||||||
|
// or another error if another error occurred.
|
||||||
|
func ParseScript(s string) (Script, error) {
|
||||||
|
if len(s) != 4 {
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
var buf [4]byte
|
||||||
|
return getScriptID(script, buf[:copy(buf[:], s)])
|
||||||
|
}
|
||||||
|
|
||||||
|
// EncodeM49 returns the Region for the given UN M.49 code.
|
||||||
|
// It returns an error if r is not a valid code.
|
||||||
|
func EncodeM49(r int) (Region, error) {
|
||||||
|
return getRegionM49(r)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseRegion parses a 2- or 3-letter ISO 3166-1 or a UN M.49 code.
|
||||||
|
// It returns a ValueError if s is a well-formed but unknown region identifier
|
||||||
|
// or another error if another error occurred.
|
||||||
|
func ParseRegion(s string) (Region, error) {
|
||||||
|
if n := len(s); n < 2 || 3 < n {
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
var buf [3]byte
|
||||||
|
return getRegionID(buf[:copy(buf[:], s)])
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsCountry returns whether this region is a country or autonomous area. This
|
||||||
|
// includes non-standard definitions from CLDR.
|
||||||
|
func (r Region) IsCountry() bool {
|
||||||
|
if r == 0 || r.IsGroup() || r.IsPrivateUse() && r != _XK {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsGroup returns whether this region defines a collection of regions. This
|
||||||
|
// includes non-standard definitions from CLDR.
|
||||||
|
func (r Region) IsGroup() bool {
|
||||||
|
if r == 0 {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
return int(regionInclusion[r]) < len(regionContainment)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Contains returns whether Region c is contained by Region r. It returns true
|
||||||
|
// if c == r.
|
||||||
|
func (r Region) Contains(c Region) bool {
|
||||||
|
if r == c {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
g := regionInclusion[r]
|
||||||
|
if g >= nRegionGroups {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
m := regionContainment[g]
|
||||||
|
|
||||||
|
d := regionInclusion[c]
|
||||||
|
b := regionInclusionBits[d]
|
||||||
|
|
||||||
|
// A contained country may belong to multiple disjoint groups. Matching any
|
||||||
|
// of these indicates containment. If the contained region is a group, it
|
||||||
|
// must strictly be a subset.
|
||||||
|
if d >= nRegionGroups {
|
||||||
|
return b&m != 0
|
||||||
|
}
|
||||||
|
return b&^m == 0
|
||||||
|
}
|
||||||
|
|
||||||
|
var errNoTLD = errors.New("language: region is not a valid ccTLD")
|
||||||
|
|
||||||
|
// TLD returns the country code top-level domain (ccTLD). UK is returned for GB.
|
||||||
|
// In all other cases it returns either the region itself or an error.
|
||||||
|
//
|
||||||
|
// This method may return an error for a region for which there exists a
|
||||||
|
// canonical form with a ccTLD. To get that ccTLD canonicalize r first. The
|
||||||
|
// region will already be canonicalized it was obtained from a Tag that was
|
||||||
|
// obtained using any of the default methods.
|
||||||
|
func (r Region) TLD() (Region, error) {
|
||||||
|
// See http://en.wikipedia.org/wiki/Country_code_top-level_domain for the
|
||||||
|
// difference between ISO 3166-1 and IANA ccTLD.
|
||||||
|
if r == _GB {
|
||||||
|
r = _UK
|
||||||
|
}
|
||||||
|
if (r.typ() & ccTLD) == 0 {
|
||||||
|
return 0, errNoTLD
|
||||||
|
}
|
||||||
|
return r, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Canonicalize returns the region or a possible replacement if the region is
|
||||||
|
// deprecated. It will not return a replacement for deprecated regions that
|
||||||
|
// are split into multiple regions.
|
||||||
|
func (r Region) Canonicalize() Region {
|
||||||
|
if cr := normRegion(r); cr != 0 {
|
||||||
|
return cr
|
||||||
|
}
|
||||||
|
return r
|
||||||
|
}
|
||||||
|
|
||||||
|
// Variant represents a registered variant of a language as defined by BCP 47.
|
||||||
|
type Variant struct {
|
||||||
|
ID uint8
|
||||||
|
str string
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParseVariant parses and returns a Variant. An error is returned if s is not
|
||||||
|
// a valid variant.
|
||||||
|
func ParseVariant(s string) (Variant, error) {
|
||||||
|
s = strings.ToLower(s)
|
||||||
|
if id, ok := variantIndex[s]; ok {
|
||||||
|
return Variant{id, s}, nil
|
||||||
|
}
|
||||||
|
return Variant{}, NewValueError([]byte(s))
|
||||||
|
}
|
||||||
|
|
||||||
|
// String returns the string representation of the variant.
|
||||||
|
func (v Variant) String() string {
|
||||||
|
return v.str
|
||||||
|
}
|
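Editor's note, not part of the vendored diff: the -u extension key/type handling in language.go above is surfaced on the public language.Tag type as TypeForKey and SetTypeForKey. A small sketch, assuming the standard golang.org/x/text/language package:

package main

import (
	"fmt"

	"golang.org/x/text/language"
)

func main() {
	// Read and replace a -u extension key-type pair (here the collation key "co").
	t := language.MustParse("de-DE-u-co-phonebk")
	fmt.Println(t.TypeForKey("co")) // prints "phonebk"
	t, _ = t.SetTypeForKey("co", "pinyin")
	fmt.Println(t) // prints "de-DE-u-co-pinyin"
}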
412
vendor/golang.org/x/text/internal/language/lookup.go
generated
vendored
Normal file
|
@ -0,0 +1,412 @@
|
||||||
|
// Copyright 2013 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package language
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"fmt"
|
||||||
|
"sort"
|
||||||
|
"strconv"
|
||||||
|
|
||||||
|
"golang.org/x/text/internal/tag"
|
||||||
|
)
|
||||||
|
|
||||||
|
// findIndex tries to find the given tag in idx and returns a standardized error
|
||||||
|
// if it could not be found.
|
||||||
|
func findIndex(idx tag.Index, key []byte, form string) (index int, err error) {
|
||||||
|
if !tag.FixCase(form, key) {
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
i := idx.Index(key)
|
||||||
|
if i == -1 {
|
||||||
|
return 0, NewValueError(key)
|
||||||
|
}
|
||||||
|
return i, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func searchUint(imap []uint16, key uint16) int {
|
||||||
|
return sort.Search(len(imap), func(i int) bool {
|
||||||
|
return imap[i] >= key
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
type Language uint16
|
||||||
|
|
||||||
|
// getLangID returns the langID of s if s is a canonical subtag
|
||||||
|
// or langUnknown if s is not a canonical subtag.
|
||||||
|
func getLangID(s []byte) (Language, error) {
|
||||||
|
if len(s) == 2 {
|
||||||
|
return getLangISO2(s)
|
||||||
|
}
|
||||||
|
return getLangISO3(s)
|
||||||
|
}
|
||||||
|
|
||||||
|
// TODO language normalization as well as the AliasMaps could be moved to the
|
||||||
|
// higher level package, but it is a bit tricky to separate the generation.
|
||||||
|
|
||||||
|
func (id Language) Canonicalize() (Language, AliasType) {
|
||||||
|
return normLang(id)
|
||||||
|
}
|
||||||
|
|
||||||
|
// mapLang returns the mapped langID of id according to mapping m.
|
||||||
|
func normLang(id Language) (Language, AliasType) {
|
||||||
|
k := sort.Search(len(AliasMap), func(i int) bool {
|
||||||
|
return AliasMap[i].From >= uint16(id)
|
||||||
|
})
|
||||||
|
if k < len(AliasMap) && AliasMap[k].From == uint16(id) {
|
||||||
|
return Language(AliasMap[k].To), AliasTypes[k]
|
||||||
|
}
|
||||||
|
return id, AliasTypeUnknown
|
||||||
|
}
|
||||||
|
|
||||||
|
// getLangISO2 returns the langID for the given 2-letter ISO language code
|
||||||
|
// or unknownLang if this does not exist.
|
||||||
|
func getLangISO2(s []byte) (Language, error) {
|
||||||
|
if !tag.FixCase("zz", s) {
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
if i := lang.Index(s); i != -1 && lang.Elem(i)[3] != 0 {
|
||||||
|
return Language(i), nil
|
||||||
|
}
|
||||||
|
return 0, NewValueError(s)
|
||||||
|
}
|
||||||
|
|
||||||
|
const base = 'z' - 'a' + 1
|
||||||
|
|
||||||
|
func strToInt(s []byte) uint {
|
||||||
|
v := uint(0)
|
||||||
|
for i := 0; i < len(s); i++ {
|
||||||
|
v *= base
|
||||||
|
v += uint(s[i] - 'a')
|
||||||
|
}
|
||||||
|
return v
|
||||||
|
}
|
||||||
|
|
||||||
|
// converts the given integer to the original ASCII string passed to strToInt.
|
||||||
|
// len(s) must match the number of characters obtained.
|
||||||
|
func intToStr(v uint, s []byte) {
|
||||||
|
for i := len(s) - 1; i >= 0; i-- {
|
||||||
|
s[i] = byte(v%base) + 'a'
|
||||||
|
v /= base
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// getLangISO3 returns the langID for the given 3-letter ISO language code
|
||||||
|
// or unknownLang if this does not exist.
|
||||||
|
func getLangISO3(s []byte) (Language, error) {
|
||||||
|
if tag.FixCase("und", s) {
|
||||||
|
// first try to match canonical 3-letter entries
|
||||||
|
for i := lang.Index(s[:2]); i != -1; i = lang.Next(s[:2], i) {
|
||||||
|
if e := lang.Elem(i); e[3] == 0 && e[2] == s[2] {
|
||||||
|
// We treat "und" as special and always translate it to "unspecified".
|
||||||
|
// Note that ZZ and Zzzz are private use and are not treated as
|
||||||
|
// unspecified by default.
|
||||||
|
id := Language(i)
|
||||||
|
if id == nonCanonicalUnd {
|
||||||
|
return 0, nil
|
||||||
|
}
|
||||||
|
return id, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if i := altLangISO3.Index(s); i != -1 {
|
||||||
|
return Language(altLangIndex[altLangISO3.Elem(i)[3]]), nil
|
||||||
|
}
|
||||||
|
n := strToInt(s)
|
||||||
|
if langNoIndex[n/8]&(1<<(n%8)) != 0 {
|
||||||
|
return Language(n) + langNoIndexOffset, nil
|
||||||
|
}
|
||||||
|
// Check for non-canonical uses of ISO3.
|
||||||
|
for i := lang.Index(s[:1]); i != -1; i = lang.Next(s[:1], i) {
|
||||||
|
if e := lang.Elem(i); e[2] == s[1] && e[3] == s[2] {
|
||||||
|
return Language(i), nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return 0, NewValueError(s)
|
||||||
|
}
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
|
||||||
|
// StringToBuf writes the string to b and returns the number of bytes
|
||||||
|
// written. cap(b) must be >= 3.
|
||||||
|
func (id Language) StringToBuf(b []byte) int {
|
||||||
|
if id >= langNoIndexOffset {
|
||||||
|
intToStr(uint(id)-langNoIndexOffset, b[:3])
|
||||||
|
return 3
|
||||||
|
} else if id == 0 {
|
||||||
|
return copy(b, "und")
|
||||||
|
}
|
||||||
|
l := lang[id<<2:]
|
||||||
|
if l[3] == 0 {
|
||||||
|
return copy(b, l[:3])
|
||||||
|
}
|
||||||
|
return copy(b, l[:2])
|
||||||
|
}
|
||||||
|
|
||||||
|
// String returns the BCP 47 representation of the langID.
|
||||||
|
// Use b as variable name, instead of id, to ensure the variable
|
||||||
|
// used is consistent with that of Base in which this type is embedded.
|
||||||
|
func (b Language) String() string {
|
||||||
|
if b == 0 {
|
||||||
|
return "und"
|
||||||
|
} else if b >= langNoIndexOffset {
|
||||||
|
b -= langNoIndexOffset
|
||||||
|
buf := [3]byte{}
|
||||||
|
intToStr(uint(b), buf[:])
|
||||||
|
return string(buf[:])
|
||||||
|
}
|
||||||
|
l := lang.Elem(int(b))
|
||||||
|
if l[3] == 0 {
|
||||||
|
return l[:3]
|
||||||
|
}
|
||||||
|
return l[:2]
|
||||||
|
}
|
||||||
|
|
||||||
|
// ISO3 returns the ISO 639-3 language code.
|
||||||
|
func (b Language) ISO3() string {
|
||||||
|
if b == 0 || b >= langNoIndexOffset {
|
||||||
|
return b.String()
|
||||||
|
}
|
||||||
|
l := lang.Elem(int(b))
|
||||||
|
if l[3] == 0 {
|
||||||
|
return l[:3]
|
||||||
|
} else if l[2] == 0 {
|
||||||
|
return altLangISO3.Elem(int(l[3]))[:3]
|
||||||
|
}
|
||||||
|
// This allocation will only happen for 3-letter ISO codes
|
||||||
|
// that are non-canonical BCP 47 language identifiers.
|
||||||
|
return l[0:1] + l[2:4]
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsPrivateUse reports whether this language code is reserved for private use.
|
||||||
|
func (b Language) IsPrivateUse() bool {
|
||||||
|
return langPrivateStart <= b && b <= langPrivateEnd
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressScript returns the script marked as SuppressScript in the IANA
|
||||||
|
// language tag repository, or 0 if there is no such script.
|
||||||
|
func (b Language) SuppressScript() Script {
|
||||||
|
if b < langNoIndexOffset {
|
||||||
|
return Script(suppressScript[b])
|
||||||
|
}
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
type Region uint16
|
||||||
|
|
||||||
|
// getRegionID returns the region id for s if s is a valid 2-letter region code
|
||||||
|
// or unknownRegion.
|
||||||
|
func getRegionID(s []byte) (Region, error) {
|
||||||
|
if len(s) == 3 {
|
||||||
|
if isAlpha(s[0]) {
|
||||||
|
return getRegionISO3(s)
|
||||||
|
}
|
||||||
|
if i, err := strconv.ParseUint(string(s), 10, 10); err == nil {
|
||||||
|
return getRegionM49(int(i))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return getRegionISO2(s)
|
||||||
|
}
|
||||||
|
|
||||||
|
// getRegionISO2 returns the regionID for the given 2-letter ISO country code
|
||||||
|
// or unknownRegion if this does not exist.
|
||||||
|
func getRegionISO2(s []byte) (Region, error) {
|
||||||
|
i, err := findIndex(regionISO, s, "ZZ")
|
||||||
|
if err != nil {
|
||||||
|
return 0, err
|
||||||
|
}
|
||||||
|
return Region(i) + isoRegionOffset, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// getRegionISO3 returns the regionID for the given 3-letter ISO country code
|
||||||
|
// or unknownRegion if this does not exist.
|
||||||
|
func getRegionISO3(s []byte) (Region, error) {
|
||||||
|
if tag.FixCase("ZZZ", s) {
|
||||||
|
for i := regionISO.Index(s[:1]); i != -1; i = regionISO.Next(s[:1], i) {
|
||||||
|
if e := regionISO.Elem(i); e[2] == s[1] && e[3] == s[2] {
|
||||||
|
return Region(i) + isoRegionOffset, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for i := 0; i < len(altRegionISO3); i += 3 {
|
||||||
|
if tag.Compare(altRegionISO3[i:i+3], s) == 0 {
|
||||||
|
return Region(altRegionIDs[i/3]), nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return 0, NewValueError(s)
|
||||||
|
}
|
||||||
|
return 0, ErrSyntax
|
||||||
|
}
|
||||||
|
|
||||||
|
func getRegionM49(n int) (Region, error) {
|
||||||
|
if 0 < n && n <= 999 {
|
||||||
|
const (
|
||||||
|
searchBits = 7
|
||||||
|
regionBits = 9
|
||||||
|
regionMask = 1<<regionBits - 1
|
||||||
|
)
|
||||||
|
idx := n >> searchBits
|
||||||
|
buf := fromM49[m49Index[idx]:m49Index[idx+1]]
|
||||||
|
val := uint16(n) << regionBits // we rely on bits shifting out
|
||||||
|
i := sort.Search(len(buf), func(i int) bool {
|
||||||
|
return buf[i] >= val
|
||||||
|
})
|
||||||
|
if r := fromM49[int(m49Index[idx])+i]; r&^regionMask == val {
|
||||||
|
return Region(r & regionMask), nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
var e ValueError
|
||||||
|
fmt.Fprint(bytes.NewBuffer([]byte(e.v[:])), n)
|
||||||
|
return 0, e
|
||||||
|
}
|
||||||
|
|
||||||
|
// normRegion returns a region if r is deprecated or 0 otherwise.
|
||||||
|
// TODO: consider supporting BYS (-> BLR), CSK (-> 200 or CZ), PHI (-> PHL) and AFI (-> DJ).
|
||||||
|
// TODO: consider mapping split up regions to new most populous one (like CLDR).
|
||||||
|
func normRegion(r Region) Region {
|
||||||
|
m := regionOldMap
|
||||||
|
k := sort.Search(len(m), func(i int) bool {
|
||||||
|
return m[i].From >= uint16(r)
|
||||||
|
})
|
||||||
|
if k < len(m) && m[k].From == uint16(r) {
|
||||||
|
return Region(m[k].To)
|
||||||
|
}
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
const (
|
||||||
|
iso3166UserAssigned = 1 << iota
|
||||||
|
ccTLD
|
||||||
|
bcp47Region
|
||||||
|
)
|
||||||
|
|
||||||
|
func (r Region) typ() byte {
|
||||||
|
return regionTypes[r]
|
||||||
|
}
|
||||||
|
|
||||||
|
// String returns the BCP 47 representation for the region.
|
||||||
|
// It returns "ZZ" for an unspecified region.
|
||||||
|
func (r Region) String() string {
|
||||||
|
if r < isoRegionOffset {
|
||||||
|
if r == 0 {
|
||||||
|
return "ZZ"
|
||||||
|
}
|
||||||
|
return fmt.Sprintf("%03d", r.M49())
|
||||||
|
}
|
||||||
|
r -= isoRegionOffset
|
||||||
|
return regionISO.Elem(int(r))[:2]
|
||||||
|
}
|
||||||
|
|
||||||
|
// ISO3 returns the 3-letter ISO code of r.
|
||||||
|
// Note that not all regions have a 3-letter ISO code.
|
||||||
|
// In such cases this method returns "ZZZ".
|
||||||
|
func (r Region) ISO3() string {
|
||||||
|
if r < isoRegionOffset {
|
||||||
|
return "ZZZ"
|
||||||
|
}
|
||||||
|
r -= isoRegionOffset
|
||||||
|
reg := regionISO.Elem(int(r))
|
||||||
|
switch reg[2] {
|
||||||
|
case 0:
|
||||||
|
return altRegionISO3[reg[3]:][:3]
|
||||||
|
case ' ':
|
||||||
|
return "ZZZ"
|
||||||
|
}
|
||||||
|
return reg[0:1] + reg[2:4]
|
||||||
|
}
|
||||||
|
|
||||||
|
// M49 returns the UN M.49 encoding of r, or 0 if this encoding
|
||||||
|
// is not defined for r.
|
||||||
|
func (r Region) M49() int {
|
||||||
|
return int(m49[r])
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsPrivateUse reports whether r has the ISO 3166 User-assigned status. This
|
||||||
|
// may include private-use tags that are assigned by CLDR and used in this
|
||||||
|
// implementation. So IsPrivateUse and IsCountry can be simultaneously true.
|
||||||
|
func (r Region) IsPrivateUse() bool {
|
||||||
|
return r.typ()&iso3166UserAssigned != 0
|
||||||
|
}
|
||||||
|
|
||||||
|
type Script uint8
|
||||||
|
|
||||||
|
// getScriptID returns the script id for string s. It assumes that s
|
||||||
|
// is of the format [A-Z][a-z]{3}.
|
||||||
|
func getScriptID(idx tag.Index, s []byte) (Script, error) {
|
||||||
|
i, err := findIndex(idx, s, "Zzzz")
|
||||||
|
return Script(i), err
|
||||||
|
}
|
||||||
|
|
||||||
|
// String returns the script code in title case.
|
||||||
|
// It returns "Zzzz" for an unspecified script.
|
||||||
|
func (s Script) String() string {
|
||||||
|
if s == 0 {
|
||||||
|
return "Zzzz"
|
||||||
|
}
|
||||||
|
return script.Elem(int(s))
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsPrivateUse reports whether this script code is reserved for private use.
|
||||||
|
func (s Script) IsPrivateUse() bool {
|
||||||
|
return _Qaaa <= s && s <= _Qabx
|
||||||
|
}
|
||||||
|
|
||||||
|
const (
|
||||||
|
maxAltTaglen = len("en-US-POSIX")
|
||||||
|
maxLen = maxAltTaglen
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
// grandfatheredMap holds a mapping from legacy and grandfathered tags to
|
||||||
|
// their base language or index to more elaborate tag.
|
||||||
|
grandfatheredMap = map[[maxLen]byte]int16{
|
||||||
|
[maxLen]byte{'a', 'r', 't', '-', 'l', 'o', 'j', 'b', 'a', 'n'}: _jbo, // art-lojban
|
||||||
|
[maxLen]byte{'i', '-', 'a', 'm', 'i'}: _ami, // i-ami
|
||||||
|
[maxLen]byte{'i', '-', 'b', 'n', 'n'}: _bnn, // i-bnn
|
||||||
|
[maxLen]byte{'i', '-', 'h', 'a', 'k'}: _hak, // i-hak
|
||||||
|
[maxLen]byte{'i', '-', 'k', 'l', 'i', 'n', 'g', 'o', 'n'}: _tlh, // i-klingon
|
||||||
|
[maxLen]byte{'i', '-', 'l', 'u', 'x'}: _lb, // i-lux
|
||||||
|
[maxLen]byte{'i', '-', 'n', 'a', 'v', 'a', 'j', 'o'}: _nv, // i-navajo
|
||||||
|
[maxLen]byte{'i', '-', 'p', 'w', 'n'}: _pwn, // i-pwn
|
||||||
|
[maxLen]byte{'i', '-', 't', 'a', 'o'}: _tao, // i-tao
|
||||||
|
[maxLen]byte{'i', '-', 't', 'a', 'y'}: _tay, // i-tay
|
||||||
|
[maxLen]byte{'i', '-', 't', 's', 'u'}: _tsu, // i-tsu
|
||||||
|
[maxLen]byte{'n', 'o', '-', 'b', 'o', 'k'}: _nb, // no-bok
|
||||||
|
[maxLen]byte{'n', 'o', '-', 'n', 'y', 'n'}: _nn, // no-nyn
|
||||||
|
[maxLen]byte{'s', 'g', 'n', '-', 'b', 'e', '-', 'f', 'r'}: _sfb, // sgn-BE-FR
|
||||||
|
[maxLen]byte{'s', 'g', 'n', '-', 'b', 'e', '-', 'n', 'l'}: _vgt, // sgn-BE-NL
|
||||||
|
[maxLen]byte{'s', 'g', 'n', '-', 'c', 'h', '-', 'd', 'e'}: _sgg, // sgn-CH-DE
|
||||||
|
[maxLen]byte{'z', 'h', '-', 'g', 'u', 'o', 'y', 'u'}: _cmn, // zh-guoyu
|
||||||
|
[maxLen]byte{'z', 'h', '-', 'h', 'a', 'k', 'k', 'a'}: _hak, // zh-hakka
|
||||||
|
[maxLen]byte{'z', 'h', '-', 'm', 'i', 'n', '-', 'n', 'a', 'n'}: _nan, // zh-min-nan
|
||||||
|
[maxLen]byte{'z', 'h', '-', 'x', 'i', 'a', 'n', 'g'}: _hsn, // zh-xiang
|
||||||
|
|
||||||
|
// Grandfathered tags with no modern replacement will be converted as
|
||||||
|
// follows:
|
||||||
|
[maxLen]byte{'c', 'e', 'l', '-', 'g', 'a', 'u', 'l', 'i', 's', 'h'}: -1, // cel-gaulish
|
||||||
|
[maxLen]byte{'e', 'n', '-', 'g', 'b', '-', 'o', 'e', 'd'}: -2, // en-GB-oed
|
||||||
|
[maxLen]byte{'i', '-', 'd', 'e', 'f', 'a', 'u', 'l', 't'}: -3, // i-default
|
||||||
|
[maxLen]byte{'i', '-', 'e', 'n', 'o', 'c', 'h', 'i', 'a', 'n'}: -4, // i-enochian
|
||||||
|
[maxLen]byte{'i', '-', 'm', 'i', 'n', 'g', 'o'}: -5, // i-mingo
|
||||||
|
[maxLen]byte{'z', 'h', '-', 'm', 'i', 'n'}: -6, // zh-min
|
||||||
|
|
||||||
|
// CLDR-specific tag.
|
||||||
|
[maxLen]byte{'r', 'o', 'o', 't'}: 0, // root
|
||||||
|
[maxLen]byte{'e', 'n', '-', 'u', 's', '-', 'p', 'o', 's', 'i', 'x'}: -7, // en_US_POSIX"
|
||||||
|
}
|
||||||
|
|
||||||
|
altTagIndex = [...]uint8{0, 17, 31, 45, 61, 74, 86, 102}
|
||||||
|
|
||||||
|
altTags = "xtg-x-cel-gaulishen-GB-oxendicten-x-i-defaultund-x-i-enochiansee-x-i-mingonan-x-zh-minen-US-u-va-posix"
|
||||||
|
)
|
||||||
|
|
||||||
|
func grandfathered(s [maxAltTaglen]byte) (t Tag, ok bool) {
|
||||||
|
if v, ok := grandfatheredMap[s]; ok {
|
||||||
|
if v < 0 {
|
||||||
|
return Make(altTags[altTagIndex[-v-1]:altTagIndex[-v]]), true
|
||||||
|
}
|
||||||
|
t.LangID = Language(v)
|
||||||
|
return t, true
|
||||||
|
}
|
||||||
|
return t, false
|
||||||
|
}
|
226
vendor/golang.org/x/text/internal/language/match.go
generated
vendored
Normal file
|
@ -0,0 +1,226 @@
// Copyright 2013 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package language

import "errors"

type scriptRegionFlags uint8

const (
	isList = 1 << iota
	scriptInFrom
	regionInFrom
)

func (t *Tag) setUndefinedLang(id Language) {
	if t.LangID == 0 {
		t.LangID = id
	}
}

func (t *Tag) setUndefinedScript(id Script) {
	if t.ScriptID == 0 {
		t.ScriptID = id
	}
}

func (t *Tag) setUndefinedRegion(id Region) {
	if t.RegionID == 0 || t.RegionID.Contains(id) {
		t.RegionID = id
	}
}

// ErrMissingLikelyTagsData indicates no information was available
// to compute likely values of missing tags.
var ErrMissingLikelyTagsData = errors.New("missing likely tags data")

// addLikelySubtags sets subtags to their most likely value, given the locale.
// In most cases this means setting fields for unknown values, but in some
// cases it may alter a value. It returns an ErrMissingLikelyTagsData error
// if the given locale cannot be expanded.
func (t Tag) addLikelySubtags() (Tag, error) {
	id, err := addTags(t)
	if err != nil {
		return t, err
	} else if id.equalTags(t) {
		return t, nil
	}
	id.RemakeString()
	return id, nil
}

// specializeRegion attempts to specialize a group region.
func specializeRegion(t *Tag) bool {
	if i := regionInclusion[t.RegionID]; i < nRegionGroups {
		x := likelyRegionGroup[i]
		if Language(x.lang) == t.LangID && Script(x.script) == t.ScriptID {
			t.RegionID = Region(x.region)
		}
		return true
	}
	return false
}

// Maximize returns a new tag with missing tags filled in.
func (t Tag) Maximize() (Tag, error) {
	return addTags(t)
}

func addTags(t Tag) (Tag, error) {
	// We leave private use identifiers alone.
	if t.IsPrivateUse() {
		return t, nil
	}
	if t.ScriptID != 0 && t.RegionID != 0 {
		if t.LangID != 0 {
			// already fully specified
			specializeRegion(&t)
			return t, nil
		}
		// Search matches for und-script-region. Note that for these cases
		// region will never be a group so there is no need to check for this.
		list := likelyRegion[t.RegionID : t.RegionID+1]
		if x := list[0]; x.flags&isList != 0 {
			list = likelyRegionList[x.lang : x.lang+uint16(x.script)]
		}
		for _, x := range list {
			// Deviating from the spec. See match_test.go for details.
			if Script(x.script) == t.ScriptID {
				t.setUndefinedLang(Language(x.lang))
				return t, nil
			}
		}
	}
	if t.LangID != 0 {
		// Search matches for lang-script and lang-region, where lang != und.
		if t.LangID < langNoIndexOffset {
			x := likelyLang[t.LangID]
			if x.flags&isList != 0 {
				list := likelyLangList[x.region : x.region+uint16(x.script)]
				if t.ScriptID != 0 {
					for _, x := range list {
						if Script(x.script) == t.ScriptID && x.flags&scriptInFrom != 0 {
							t.setUndefinedRegion(Region(x.region))
							return t, nil
						}
					}
				} else if t.RegionID != 0 {
					count := 0
					goodScript := true
					tt := t
					for _, x := range list {
						// We visit all entries for which the script was not
						// defined, including the ones where the region was not
						// defined. This allows for proper disambiguation within
						// regions.
						if x.flags&scriptInFrom == 0 && t.RegionID.Contains(Region(x.region)) {
							tt.RegionID = Region(x.region)
							tt.setUndefinedScript(Script(x.script))
							goodScript = goodScript && tt.ScriptID == Script(x.script)
							count++
						}
					}
					if count == 1 {
						return tt, nil
					}
					// Even if we fail to find a unique Region, we might have
					// an unambiguous script.
					if goodScript {
						t.ScriptID = tt.ScriptID
					}
				}
			}
		}
	} else {
		// Search matches for und-script.
		if t.ScriptID != 0 {
			x := likelyScript[t.ScriptID]
			if x.region != 0 {
				t.setUndefinedRegion(Region(x.region))
				t.setUndefinedLang(Language(x.lang))
				return t, nil
			}
		}
		// Search matches for und-region. If und-script-region exists, it would
		// have been found earlier.
		if t.RegionID != 0 {
			if i := regionInclusion[t.RegionID]; i < nRegionGroups {
				x := likelyRegionGroup[i]
				if x.region != 0 {
					t.setUndefinedLang(Language(x.lang))
					t.setUndefinedScript(Script(x.script))
					t.RegionID = Region(x.region)
				}
			} else {
				x := likelyRegion[t.RegionID]
				if x.flags&isList != 0 {
					x = likelyRegionList[x.lang]
				}
				if x.script != 0 && x.flags != scriptInFrom {
					t.setUndefinedLang(Language(x.lang))
					t.setUndefinedScript(Script(x.script))
					return t, nil
				}
			}
		}
	}

	// Search matches for lang.
	if t.LangID < langNoIndexOffset {
		x := likelyLang[t.LangID]
		if x.flags&isList != 0 {
			x = likelyLangList[x.region]
		}
		if x.region != 0 {
			t.setUndefinedScript(Script(x.script))
			t.setUndefinedRegion(Region(x.region))
		}
		specializeRegion(&t)
		if t.LangID == 0 {
			t.LangID = _en // default language
		}
		return t, nil
	}
	return t, ErrMissingLikelyTagsData
}

func (t *Tag) setTagsFrom(id Tag) {
	t.LangID = id.LangID
	t.ScriptID = id.ScriptID
	t.RegionID = id.RegionID
}

// minimize removes the region or script subtags from t such that
// t.addLikelySubtags() == t.minimize().addLikelySubtags().
func (t Tag) minimize() (Tag, error) {
	t, err := minimizeTags(t)
	if err != nil {
		return t, err
	}
	t.RemakeString()
	return t, nil
}

// minimizeTags mimics the behavior of the ICU 51 C implementation.
func minimizeTags(t Tag) (Tag, error) {
	if t.equalTags(Und) {
		return t, nil
	}
	max, err := addTags(t)
	if err != nil {
		return t, err
	}
	for _, id := range [...]Tag{
		{LangID: t.LangID},
		{LangID: t.LangID, RegionID: t.RegionID},
		{LangID: t.LangID, ScriptID: t.ScriptID},
	} {
		if x, err := addTags(id); err == nil && max.equalTags(x) {
			t.setTagsFrom(id)
			break
		}
	}
	return t, nil
}
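addTags is the core of likely-subtag expansion: it fills in a missing language, script, or region from the likely* tables while leaving private-use tags untouched. Because this is an internal package, the sketch below is written as a hypothetical test inside the package itself rather than an importable example; the test name is invented for illustration and the only assumption is that the generated tables supply a script for zh-TW:

```go
package language

import "testing"

// TestMaximizeSketch is an illustrative sketch, not part of the vendored code.
func TestMaximizeSketch(t *testing.T) {
	tag, err := Parse("zh-TW") // language and region set, script missing
	if err != nil {
		t.Fatal(err)
	}
	max, err := tag.Maximize() // fills the script from the likely-subtags tables
	if err != nil {
		t.Fatal(err)
	}
	if max.ScriptID == 0 {
		t.Errorf("expected Maximize to fill in a script, got %v", max)
	}
}
```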
594
vendor/golang.org/x/text/internal/language/parse.go
generated
vendored
Normal file
@@ -0,0 +1,594 @@
|
||||||
|
// Copyright 2013 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package language
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"sort"
|
||||||
|
|
||||||
|
"golang.org/x/text/internal/tag"
|
||||||
|
)
|
||||||
|
|
||||||
|
// isAlpha returns true if the byte is not a digit.
|
||||||
|
// b must be an ASCII letter or digit.
|
||||||
|
func isAlpha(b byte) bool {
|
||||||
|
return b > '9'
|
||||||
|
}
|
||||||
|
|
||||||
|
// isAlphaNum returns true if the string contains only ASCII letters or digits.
|
||||||
|
func isAlphaNum(s []byte) bool {
|
||||||
|
for _, c := range s {
|
||||||
|
if !('a' <= c && c <= 'z' || 'A' <= c && c <= 'Z' || '0' <= c && c <= '9') {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
// ErrSyntax is returned by any of the parsing functions when the
|
||||||
|
// input is not well-formed, according to BCP 47.
|
||||||
|
// TODO: return the position at which the syntax error occurred?
|
||||||
|
var ErrSyntax = errors.New("language: tag is not well-formed")
|
||||||
|
|
||||||
|
// ErrDuplicateKey is returned when a tag contains the same key twice with
|
||||||
|
// different values in the -u section.
|
||||||
|
var ErrDuplicateKey = errors.New("language: different values for same key in -u extension")
|
||||||
|
|
||||||
|
// ValueError is returned by any of the parsing functions when the
|
||||||
|
// input is well-formed but the respective subtag is not recognized
|
||||||
|
// as a valid value.
|
||||||
|
type ValueError struct {
|
||||||
|
v [8]byte
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewValueError creates a new ValueError.
|
||||||
|
func NewValueError(tag []byte) ValueError {
|
||||||
|
var e ValueError
|
||||||
|
copy(e.v[:], tag)
|
||||||
|
return e
|
||||||
|
}
|
||||||
|
|
||||||
|
func (e ValueError) tag() []byte {
|
||||||
|
n := bytes.IndexByte(e.v[:], 0)
|
||||||
|
if n == -1 {
|
||||||
|
n = 8
|
||||||
|
}
|
||||||
|
return e.v[:n]
|
||||||
|
}
|
||||||
|
|
||||||
|
// Error implements the error interface.
|
||||||
|
func (e ValueError) Error() string {
|
||||||
|
return fmt.Sprintf("language: subtag %q is well-formed but unknown", e.tag())
|
||||||
|
}
|
||||||
|
|
||||||
|
// Subtag returns the subtag for which the error occurred.
|
||||||
|
func (e ValueError) Subtag() string {
|
||||||
|
return string(e.tag())
|
||||||
|
}
|
||||||
|
|
||||||
|
// scanner is used to scan BCP 47 tokens, which are separated by _ or -.
|
||||||
|
type scanner struct {
|
||||||
|
b []byte
|
||||||
|
bytes [max99thPercentileSize]byte
|
||||||
|
token []byte
|
||||||
|
start int // start position of the current token
|
||||||
|
end int // end position of the current token
|
||||||
|
next int // next point for scan
|
||||||
|
err error
|
||||||
|
done bool
|
||||||
|
}
|
||||||
|
|
||||||
|
func makeScannerString(s string) scanner {
|
||||||
|
scan := scanner{}
|
||||||
|
if len(s) <= len(scan.bytes) {
|
||||||
|
scan.b = scan.bytes[:copy(scan.bytes[:], s)]
|
||||||
|
} else {
|
||||||
|
scan.b = []byte(s)
|
||||||
|
}
|
||||||
|
scan.init()
|
||||||
|
return scan
|
||||||
|
}
|
||||||
|
|
||||||
|
// makeScanner returns a scanner using b as the input buffer.
|
||||||
|
// b is not copied and may be modified by the scanner routines.
|
||||||
|
func makeScanner(b []byte) scanner {
|
||||||
|
scan := scanner{b: b}
|
||||||
|
scan.init()
|
||||||
|
return scan
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *scanner) init() {
|
||||||
|
for i, c := range s.b {
|
||||||
|
if c == '_' {
|
||||||
|
s.b[i] = '-'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
s.scan()
|
||||||
|
}
|
||||||
|
|
||||||
|
// restToLower converts the string between start and end to lower case.
|
||||||
|
func (s *scanner) toLower(start, end int) {
|
||||||
|
for i := start; i < end; i++ {
|
||||||
|
c := s.b[i]
|
||||||
|
if 'A' <= c && c <= 'Z' {
|
||||||
|
s.b[i] += 'a' - 'A'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *scanner) setError(e error) {
|
||||||
|
if s.err == nil || (e == ErrSyntax && s.err != ErrSyntax) {
|
||||||
|
s.err = e
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// resizeRange shrinks or grows the array at position oldStart such that
|
||||||
|
// a new string of size newSize can fit between oldStart and oldEnd.
|
||||||
|
// Sets the scan point to after the resized range.
|
||||||
|
func (s *scanner) resizeRange(oldStart, oldEnd, newSize int) {
|
||||||
|
s.start = oldStart
|
||||||
|
if end := oldStart + newSize; end != oldEnd {
|
||||||
|
diff := end - oldEnd
|
||||||
|
if end < cap(s.b) {
|
||||||
|
b := make([]byte, len(s.b)+diff)
|
||||||
|
copy(b, s.b[:oldStart])
|
||||||
|
copy(b[end:], s.b[oldEnd:])
|
||||||
|
s.b = b
|
||||||
|
} else {
|
||||||
|
s.b = append(s.b[end:], s.b[oldEnd:]...)
|
||||||
|
}
|
||||||
|
s.next = end + (s.next - s.end)
|
||||||
|
s.end = end
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// replace replaces the current token with repl.
|
||||||
|
func (s *scanner) replace(repl string) {
|
||||||
|
s.resizeRange(s.start, s.end, len(repl))
|
||||||
|
copy(s.b[s.start:], repl)
|
||||||
|
}
|
||||||
|
|
||||||
|
// gobble removes the current token from the input.
|
||||||
|
// Caller must call scan after calling gobble.
|
||||||
|
func (s *scanner) gobble(e error) {
|
||||||
|
s.setError(e)
|
||||||
|
if s.start == 0 {
|
||||||
|
s.b = s.b[:+copy(s.b, s.b[s.next:])]
|
||||||
|
s.end = 0
|
||||||
|
} else {
|
||||||
|
s.b = s.b[:s.start-1+copy(s.b[s.start-1:], s.b[s.end:])]
|
||||||
|
s.end = s.start - 1
|
||||||
|
}
|
||||||
|
s.next = s.start
|
||||||
|
}
|
||||||
|
|
||||||
|
// deleteRange removes the given range from s.b before the current token.
|
||||||
|
func (s *scanner) deleteRange(start, end int) {
|
||||||
|
s.b = s.b[:start+copy(s.b[start:], s.b[end:])]
|
||||||
|
diff := end - start
|
||||||
|
s.next -= diff
|
||||||
|
s.start -= diff
|
||||||
|
s.end -= diff
|
||||||
|
}
|
||||||
|
|
||||||
|
// scan parses the next token of a BCP 47 string. Tokens that are larger
|
||||||
|
// than 8 characters or include non-alphanumeric characters result in an error
|
||||||
|
// and are gobbled and removed from the output.
|
||||||
|
// It returns the end position of the last token consumed.
|
||||||
|
func (s *scanner) scan() (end int) {
|
||||||
|
end = s.end
|
||||||
|
s.token = nil
|
||||||
|
for s.start = s.next; s.next < len(s.b); {
|
||||||
|
i := bytes.IndexByte(s.b[s.next:], '-')
|
||||||
|
if i == -1 {
|
||||||
|
s.end = len(s.b)
|
||||||
|
s.next = len(s.b)
|
||||||
|
i = s.end - s.start
|
||||||
|
} else {
|
||||||
|
s.end = s.next + i
|
||||||
|
s.next = s.end + 1
|
||||||
|
}
|
||||||
|
token := s.b[s.start:s.end]
|
||||||
|
if i < 1 || i > 8 || !isAlphaNum(token) {
|
||||||
|
s.gobble(ErrSyntax)
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
s.token = token
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
if n := len(s.b); n > 0 && s.b[n-1] == '-' {
|
||||||
|
s.setError(ErrSyntax)
|
||||||
|
s.b = s.b[:len(s.b)-1]
|
||||||
|
}
|
||||||
|
s.done = true
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
|
||||||
|
// acceptMinSize parses multiple tokens of the given size or greater.
|
||||||
|
// It returns the end position of the last token consumed.
|
||||||
|
func (s *scanner) acceptMinSize(min int) (end int) {
|
||||||
|
end = s.end
|
||||||
|
s.scan()
|
||||||
|
for ; len(s.token) >= min; s.scan() {
|
||||||
|
end = s.end
|
||||||
|
}
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse parses the given BCP 47 string and returns a valid Tag. If parsing
|
||||||
|
// failed it returns an error and any part of the tag that could be parsed.
|
||||||
|
// If parsing succeeded but an unknown value was found, it returns
|
||||||
|
// ValueError. The Tag returned in this case is just stripped of the unknown
|
||||||
|
// value. All other values are preserved. It accepts tags in the BCP 47 format
|
||||||
|
// and extensions to this standard defined in
|
||||||
|
// https://www.unicode.org/reports/tr35/#Unicode_Language_and_Locale_Identifiers.
|
||||||
|
func Parse(s string) (t Tag, err error) {
|
||||||
|
// TODO: consider supporting old-style locale key-value pairs.
|
||||||
|
if s == "" {
|
||||||
|
return Und, ErrSyntax
|
||||||
|
}
|
||||||
|
if len(s) <= maxAltTaglen {
|
||||||
|
b := [maxAltTaglen]byte{}
|
||||||
|
for i, c := range s {
|
||||||
|
// Generating invalid UTF-8 is okay as it won't match.
|
||||||
|
if 'A' <= c && c <= 'Z' {
|
||||||
|
c += 'a' - 'A'
|
||||||
|
} else if c == '_' {
|
||||||
|
c = '-'
|
||||||
|
}
|
||||||
|
b[i] = byte(c)
|
||||||
|
}
|
||||||
|
if t, ok := grandfathered(b); ok {
|
||||||
|
return t, nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
scan := makeScannerString(s)
|
||||||
|
return parse(&scan, s)
|
||||||
|
}
|
||||||
|
|
||||||
|
func parse(scan *scanner, s string) (t Tag, err error) {
|
||||||
|
t = Und
|
||||||
|
var end int
|
||||||
|
if n := len(scan.token); n <= 1 {
|
||||||
|
scan.toLower(0, len(scan.b))
|
||||||
|
if n == 0 || scan.token[0] != 'x' {
|
||||||
|
return t, ErrSyntax
|
||||||
|
}
|
||||||
|
end = parseExtensions(scan)
|
||||||
|
} else if n >= 4 {
|
||||||
|
return Und, ErrSyntax
|
||||||
|
} else { // the usual case
|
||||||
|
t, end = parseTag(scan)
|
||||||
|
if n := len(scan.token); n == 1 {
|
||||||
|
t.pExt = uint16(end)
|
||||||
|
end = parseExtensions(scan)
|
||||||
|
} else if end < len(scan.b) {
|
||||||
|
scan.setError(ErrSyntax)
|
||||||
|
scan.b = scan.b[:end]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if int(t.pVariant) < len(scan.b) {
|
||||||
|
if end < len(s) {
|
||||||
|
s = s[:end]
|
||||||
|
}
|
||||||
|
if len(s) > 0 && tag.Compare(s, scan.b) == 0 {
|
||||||
|
t.str = s
|
||||||
|
} else {
|
||||||
|
t.str = string(scan.b)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
t.pVariant, t.pExt = 0, 0
|
||||||
|
}
|
||||||
|
return t, scan.err
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseTag parses language, script, region and variants.
|
||||||
|
// It returns a Tag and the end position in the input that was parsed.
|
||||||
|
func parseTag(scan *scanner) (t Tag, end int) {
|
||||||
|
var e error
|
||||||
|
// TODO: set an error if an unknown lang, script or region is encountered.
|
||||||
|
t.LangID, e = getLangID(scan.token)
|
||||||
|
scan.setError(e)
|
||||||
|
scan.replace(t.LangID.String())
|
||||||
|
langStart := scan.start
|
||||||
|
end = scan.scan()
|
||||||
|
for len(scan.token) == 3 && isAlpha(scan.token[0]) {
|
||||||
|
// From http://tools.ietf.org/html/bcp47, <lang>-<extlang> tags are equivalent
|
||||||
|
// to a tag of the form <extlang>.
|
||||||
|
lang, e := getLangID(scan.token)
|
||||||
|
if lang != 0 {
|
||||||
|
t.LangID = lang
|
||||||
|
copy(scan.b[langStart:], lang.String())
|
||||||
|
scan.b[langStart+3] = '-'
|
||||||
|
scan.start = langStart + 4
|
||||||
|
}
|
||||||
|
scan.gobble(e)
|
||||||
|
end = scan.scan()
|
||||||
|
}
|
||||||
|
if len(scan.token) == 4 && isAlpha(scan.token[0]) {
|
||||||
|
t.ScriptID, e = getScriptID(script, scan.token)
|
||||||
|
if t.ScriptID == 0 {
|
||||||
|
scan.gobble(e)
|
||||||
|
}
|
||||||
|
end = scan.scan()
|
||||||
|
}
|
||||||
|
if n := len(scan.token); n >= 2 && n <= 3 {
|
||||||
|
t.RegionID, e = getRegionID(scan.token)
|
||||||
|
if t.RegionID == 0 {
|
||||||
|
scan.gobble(e)
|
||||||
|
} else {
|
||||||
|
scan.replace(t.RegionID.String())
|
||||||
|
}
|
||||||
|
end = scan.scan()
|
||||||
|
}
|
||||||
|
scan.toLower(scan.start, len(scan.b))
|
||||||
|
t.pVariant = byte(end)
|
||||||
|
end = parseVariants(scan, end, t)
|
||||||
|
t.pExt = uint16(end)
|
||||||
|
return t, end
|
||||||
|
}
|
||||||
|
|
||||||
|
var separator = []byte{'-'}
|
||||||
|
|
||||||
|
// parseVariants scans tokens as long as each token is a valid variant string.
|
||||||
|
// Duplicate variants are removed.
|
||||||
|
func parseVariants(scan *scanner, end int, t Tag) int {
|
||||||
|
start := scan.start
|
||||||
|
varIDBuf := [4]uint8{}
|
||||||
|
variantBuf := [4][]byte{}
|
||||||
|
varID := varIDBuf[:0]
|
||||||
|
variant := variantBuf[:0]
|
||||||
|
last := -1
|
||||||
|
needSort := false
|
||||||
|
for ; len(scan.token) >= 4; scan.scan() {
|
||||||
|
// TODO: measure the impact of needing this conversion and redesign
|
||||||
|
// the data structure if there is an issue.
|
||||||
|
v, ok := variantIndex[string(scan.token)]
|
||||||
|
if !ok {
|
||||||
|
// unknown variant
|
||||||
|
// TODO: allow user-defined variants?
|
||||||
|
scan.gobble(NewValueError(scan.token))
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
varID = append(varID, v)
|
||||||
|
variant = append(variant, scan.token)
|
||||||
|
if !needSort {
|
||||||
|
if last < int(v) {
|
||||||
|
last = int(v)
|
||||||
|
} else {
|
||||||
|
needSort = true
|
||||||
|
// There is no legal combinations of more than 7 variants
|
||||||
|
// (and this is by no means a useful sequence).
|
||||||
|
const maxVariants = 8
|
||||||
|
if len(varID) > maxVariants {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
end = scan.end
|
||||||
|
}
|
||||||
|
if needSort {
|
||||||
|
sort.Sort(variantsSort{varID, variant})
|
||||||
|
k, l := 0, -1
|
||||||
|
for i, v := range varID {
|
||||||
|
w := int(v)
|
||||||
|
if l == w {
|
||||||
|
// Remove duplicates.
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
varID[k] = varID[i]
|
||||||
|
variant[k] = variant[i]
|
||||||
|
k++
|
||||||
|
l = w
|
||||||
|
}
|
||||||
|
if str := bytes.Join(variant[:k], separator); len(str) == 0 {
|
||||||
|
end = start - 1
|
||||||
|
} else {
|
||||||
|
scan.resizeRange(start, end, len(str))
|
||||||
|
copy(scan.b[scan.start:], str)
|
||||||
|
end = scan.end
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
|
||||||
|
type variantsSort struct {
|
||||||
|
i []uint8
|
||||||
|
v [][]byte
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s variantsSort) Len() int {
|
||||||
|
return len(s.i)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s variantsSort) Swap(i, j int) {
|
||||||
|
s.i[i], s.i[j] = s.i[j], s.i[i]
|
||||||
|
s.v[i], s.v[j] = s.v[j], s.v[i]
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s variantsSort) Less(i, j int) bool {
|
||||||
|
return s.i[i] < s.i[j]
|
||||||
|
}
|
||||||
|
|
||||||
|
type bytesSort struct {
|
||||||
|
b [][]byte
|
||||||
|
n int // first n bytes to compare
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b bytesSort) Len() int {
|
||||||
|
return len(b.b)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b bytesSort) Swap(i, j int) {
|
||||||
|
b.b[i], b.b[j] = b.b[j], b.b[i]
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b bytesSort) Less(i, j int) bool {
|
||||||
|
for k := 0; k < b.n; k++ {
|
||||||
|
if b.b[i][k] == b.b[j][k] {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
return b.b[i][k] < b.b[j][k]
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseExtensions parses and normalizes the extensions in the buffer.
|
||||||
|
// It returns the last position of scan.b that is part of any extension.
|
||||||
|
// It also trims scan.b to remove excess parts accordingly.
|
||||||
|
func parseExtensions(scan *scanner) int {
|
||||||
|
start := scan.start
|
||||||
|
exts := [][]byte{}
|
||||||
|
private := []byte{}
|
||||||
|
end := scan.end
|
||||||
|
for len(scan.token) == 1 {
|
||||||
|
extStart := scan.start
|
||||||
|
ext := scan.token[0]
|
||||||
|
end = parseExtension(scan)
|
||||||
|
extension := scan.b[extStart:end]
|
||||||
|
if len(extension) < 3 || (ext != 'x' && len(extension) < 4) {
|
||||||
|
scan.setError(ErrSyntax)
|
||||||
|
end = extStart
|
||||||
|
continue
|
||||||
|
} else if start == extStart && (ext == 'x' || scan.start == len(scan.b)) {
|
||||||
|
scan.b = scan.b[:end]
|
||||||
|
return end
|
||||||
|
} else if ext == 'x' {
|
||||||
|
private = extension
|
||||||
|
break
|
||||||
|
}
|
||||||
|
exts = append(exts, extension)
|
||||||
|
}
|
||||||
|
sort.Sort(bytesSort{exts, 1})
|
||||||
|
if len(private) > 0 {
|
||||||
|
exts = append(exts, private)
|
||||||
|
}
|
||||||
|
scan.b = scan.b[:start]
|
||||||
|
if len(exts) > 0 {
|
||||||
|
scan.b = append(scan.b, bytes.Join(exts, separator)...)
|
||||||
|
} else if start > 0 {
|
||||||
|
// Strip trailing '-'.
|
||||||
|
scan.b = scan.b[:start-1]
|
||||||
|
}
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseExtension parses a single extension and returns the position of
|
||||||
|
// the extension end.
|
||||||
|
func parseExtension(scan *scanner) int {
|
||||||
|
start, end := scan.start, scan.end
|
||||||
|
switch scan.token[0] {
|
||||||
|
case 'u':
|
||||||
|
attrStart := end
|
||||||
|
scan.scan()
|
||||||
|
for last := []byte{}; len(scan.token) > 2; scan.scan() {
|
||||||
|
if bytes.Compare(scan.token, last) != -1 {
|
||||||
|
// Attributes are unsorted. Start over from scratch.
|
||||||
|
p := attrStart + 1
|
||||||
|
scan.next = p
|
||||||
|
attrs := [][]byte{}
|
||||||
|
for scan.scan(); len(scan.token) > 2; scan.scan() {
|
||||||
|
attrs = append(attrs, scan.token)
|
||||||
|
end = scan.end
|
||||||
|
}
|
||||||
|
sort.Sort(bytesSort{attrs, 3})
|
||||||
|
copy(scan.b[p:], bytes.Join(attrs, separator))
|
||||||
|
break
|
||||||
|
}
|
||||||
|
last = scan.token
|
||||||
|
end = scan.end
|
||||||
|
}
|
||||||
|
var last, key []byte
|
||||||
|
for attrEnd := end; len(scan.token) == 2; last = key {
|
||||||
|
key = scan.token
|
||||||
|
keyEnd := scan.end
|
||||||
|
end = scan.acceptMinSize(3)
|
||||||
|
// TODO: check key value validity
|
||||||
|
if keyEnd == end || bytes.Compare(key, last) != 1 {
|
||||||
|
// We have an invalid key or the keys are not sorted.
|
||||||
|
// Start scanning keys from scratch and reorder.
|
||||||
|
p := attrEnd + 1
|
||||||
|
scan.next = p
|
||||||
|
keys := [][]byte{}
|
||||||
|
for scan.scan(); len(scan.token) == 2; {
|
||||||
|
keyStart, keyEnd := scan.start, scan.end
|
||||||
|
end = scan.acceptMinSize(3)
|
||||||
|
if keyEnd != end {
|
||||||
|
keys = append(keys, scan.b[keyStart:end])
|
||||||
|
} else {
|
||||||
|
scan.setError(ErrSyntax)
|
||||||
|
end = keyStart
|
||||||
|
}
|
||||||
|
}
|
||||||
|
sort.Stable(bytesSort{keys, 2})
|
||||||
|
if n := len(keys); n > 0 {
|
||||||
|
k := 0
|
||||||
|
for i := 1; i < n; i++ {
|
||||||
|
if !bytes.Equal(keys[k][:2], keys[i][:2]) {
|
||||||
|
k++
|
||||||
|
keys[k] = keys[i]
|
||||||
|
} else if !bytes.Equal(keys[k], keys[i]) {
|
||||||
|
scan.setError(ErrDuplicateKey)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
keys = keys[:k+1]
|
||||||
|
}
|
||||||
|
reordered := bytes.Join(keys, separator)
|
||||||
|
if e := p + len(reordered); e < end {
|
||||||
|
scan.deleteRange(e, end)
|
||||||
|
end = e
|
||||||
|
}
|
||||||
|
copy(scan.b[p:], reordered)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
case 't':
|
||||||
|
scan.scan()
|
||||||
|
if n := len(scan.token); n >= 2 && n <= 3 && isAlpha(scan.token[1]) {
|
||||||
|
_, end = parseTag(scan)
|
||||||
|
scan.toLower(start, end)
|
||||||
|
}
|
||||||
|
for len(scan.token) == 2 && !isAlpha(scan.token[1]) {
|
||||||
|
end = scan.acceptMinSize(3)
|
||||||
|
}
|
||||||
|
case 'x':
|
||||||
|
end = scan.acceptMinSize(1)
|
||||||
|
default:
|
||||||
|
end = scan.acceptMinSize(2)
|
||||||
|
}
|
||||||
|
return end
|
||||||
|
}
|
||||||
|
|
||||||
|
// getExtension returns the name, body and end position of the extension.
|
||||||
|
func getExtension(s string, p int) (end int, ext string) {
|
||||||
|
if s[p] == '-' {
|
||||||
|
p++
|
||||||
|
}
|
||||||
|
if s[p] == 'x' {
|
||||||
|
return len(s), s[p:]
|
||||||
|
}
|
||||||
|
end = nextExtension(s, p)
|
||||||
|
return end, s[p:end]
|
||||||
|
}
|
||||||
|
|
||||||
|
// nextExtension finds the next extension within the string, searching
|
||||||
|
// for the -<char>- pattern from position p.
|
||||||
|
// In the fast majority of cases, language tags will have at most
|
||||||
|
// one extension and extensions tend to be small.
|
||||||
|
func nextExtension(s string, p int) int {
|
||||||
|
for n := len(s) - 3; p < n; {
|
||||||
|
if s[p] == '-' {
|
||||||
|
if s[p+2] == '-' {
|
||||||
|
return p
|
||||||
|
}
|
||||||
|
p += 3
|
||||||
|
} else {
|
||||||
|
p++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return len(s)
|
||||||
|
}
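parse.go accepts both '-' and '_' as subtag separators (the scanner rewrites underscores) and normalizes case while parsing, and its Parse documentation states that a well-formed but unknown subtag yields a ValueError with the unknown value stripped from the returned tag. A small sketch of that behavior through the public golang.org/x/text/language wrapper; the commented output is what those rules suggest, not captured from a run, and "abcdefg" is a made-up variant chosen to be well-formed but unknown:

```go
package main

import (
	"fmt"

	"golang.org/x/text/language"
)

func main() {
	// Underscores become hyphens and subtag case is normalized.
	t, err := language.Parse("EN_latn_us")
	fmt.Println(t, err) // expected: en-Latn-US <nil>

	// A well-formed but unrecognized variant should report a ValueError
	// while still returning the parseable remainder of the tag.
	t, err = language.Parse("en-abcdefg")
	fmt.Println(t, err)
}
```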
3431
vendor/golang.org/x/text/internal/language/tables.go
generated
vendored
Normal file
File diff suppressed because it is too large
48
vendor/golang.org/x/text/internal/language/tags.go
generated
vendored
Normal file
@@ -0,0 +1,48 @@
// Copyright 2013 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package language

// MustParse is like Parse, but panics if the given BCP 47 tag cannot be parsed.
// It simplifies safe initialization of Tag values.
func MustParse(s string) Tag {
	t, err := Parse(s)
	if err != nil {
		panic(err)
	}
	return t
}

// MustParseBase is like ParseBase, but panics if the given base cannot be parsed.
// It simplifies safe initialization of Base values.
func MustParseBase(s string) Language {
	b, err := ParseBase(s)
	if err != nil {
		panic(err)
	}
	return b
}

// MustParseScript is like ParseScript, but panics if the given script cannot be
// parsed. It simplifies safe initialization of Script values.
func MustParseScript(s string) Script {
	scr, err := ParseScript(s)
	if err != nil {
		panic(err)
	}
	return scr
}

// MustParseRegion is like ParseRegion, but panics if the given region cannot be
// parsed. It simplifies safe initialization of Region values.
func MustParseRegion(s string) Region {
	r, err := ParseRegion(s)
	if err != nil {
		panic(err)
	}
	return r
}

// Und is the root language.
var Und Tag
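The MustParse* helpers exist so that package-level tag values can be initialized without error handling; a malformed literal then fails loudly at program start instead of surfacing mid-request. A brief usage sketch against the public golang.org/x/text/language package, which exposes the same helpers (the variable names are illustrative):

```go
package main

import (
	"fmt"

	"golang.org/x/text/language"
)

// MustParse panics on a malformed tag, which is exactly what you want for
// literals: a typo surfaces at startup rather than deep inside a handler.
var (
	britishEnglish = language.MustParse("en-GB")
	serbianLatin   = language.MustParse("sr-Latn")
)

func main() {
	fmt.Println(britishEnglish, serbianLatin)
}
```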
100
vendor/golang.org/x/text/internal/tag/tag.go
generated
vendored
Normal file
@@ -0,0 +1,100 @@
// Copyright 2015 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// Package tag contains functionality handling tags and related data.
package tag // import "golang.org/x/text/internal/tag"

import "sort"

// An Index converts tags to a compact numeric value.
//
// All elements are of size 4. Tags may be up to 4 bytes long. Excess bytes can
// be used to store additional information about the tag.
type Index string

// Elem returns the element data at the given index.
func (s Index) Elem(x int) string {
	return string(s[x*4 : x*4+4])
}

// Index reports the index of the given key or -1 if it could not be found.
// Only the first len(key) bytes from the start of the 4-byte entries will be
// considered for the search and the first match in Index will be returned.
func (s Index) Index(key []byte) int {
	n := len(key)
	// search the index of the first entry with an equal or higher value than
	// key in s.
	index := sort.Search(len(s)/4, func(i int) bool {
		return cmp(s[i*4:i*4+n], key) != -1
	})
	i := index * 4
	if cmp(s[i:i+len(key)], key) != 0 {
		return -1
	}
	return index
}

// Next finds the next occurrence of key after index x, which must have been
// obtained from a call to Index using the same key. It returns x+1 or -1.
func (s Index) Next(key []byte, x int) int {
	if x++; x*4 < len(s) && cmp(s[x*4:x*4+len(key)], key) == 0 {
		return x
	}
	return -1
}

// cmp returns an integer comparing a and b lexicographically.
func cmp(a Index, b []byte) int {
	n := len(a)
	if len(b) < n {
		n = len(b)
	}
	for i, c := range b[:n] {
		switch {
		case a[i] > c:
			return 1
		case a[i] < c:
			return -1
		}
	}
	switch {
	case len(a) < len(b):
		return -1
	case len(a) > len(b):
		return 1
	}
	return 0
}

// Compare returns an integer comparing a and b lexicographically.
func Compare(a string, b []byte) int {
	return cmp(Index(a), b)
}

// FixCase reformats b to the same pattern of cases as form.
// It returns false if string b is malformed.
func FixCase(form string, b []byte) bool {
	if len(form) != len(b) {
		return false
	}
	for i, c := range b {
		if form[i] <= 'Z' {
			if c >= 'a' {
				c -= 'z' - 'Z'
			}
			if c < 'A' || 'Z' < c {
				return false
			}
		} else {
			if c <= 'Z' {
				c += 'z' - 'Z'
			}
			if c < 'a' || 'z' < c {
				return false
			}
		}
		b[i] = c
	}
	return true
}
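Index packs sorted, fixed-width 4-byte records into a single string: Index performs a binary search on the first len(key) bytes and Elem returns the full record, so the trailing bytes of each record are free for per-entry payload. A hypothetical in-package test sketches the idea with a hand-built three-entry index; the entries and payload bytes are made up for illustration:

```go
package tag

import "testing"

// TestIndexSketch is an illustrative sketch, not part of the vendored code.
func TestIndexSketch(t *testing.T) {
	// Three 4-byte records, sorted by their 2-byte keys; the last two bytes
	// of each record are arbitrary payload.
	idx := Index("ar\x00\x05en\x00\x13zh\x00\x2a")

	if got := idx.Index([]byte("en")); got != 1 {
		t.Errorf("Index(en) = %d, want 1", got)
	}
	if got := idx.Elem(1); got[:2] != "en" {
		t.Errorf("Elem(1) = %q, want a record starting with %q", got, "en")
	}
	if got := idx.Index([]byte("fr")); got != -1 {
		t.Errorf("Index(fr) = %d, want -1", got)
	}
}
```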
187
vendor/golang.org/x/text/language/coverage.go
generated
vendored
Normal file
@@ -0,0 +1,187 @@
|
||||||
|
// Copyright 2014 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
package language
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"sort"
|
||||||
|
|
||||||
|
"golang.org/x/text/internal/language"
|
||||||
|
)
|
||||||
|
|
||||||
|
// The Coverage interface is used to define the level of coverage of an
|
||||||
|
// internationalization service. Note that not all types are supported by all
|
||||||
|
// services. As lists may be generated on the fly, it is recommended that users
|
||||||
|
// of a Coverage cache the results.
|
||||||
|
type Coverage interface {
|
||||||
|
// Tags returns the list of supported tags.
|
||||||
|
Tags() []Tag
|
||||||
|
|
||||||
|
// BaseLanguages returns the list of supported base languages.
|
||||||
|
BaseLanguages() []Base
|
||||||
|
|
||||||
|
// Scripts returns the list of supported scripts.
|
||||||
|
Scripts() []Script
|
||||||
|
|
||||||
|
// Regions returns the list of supported regions.
|
||||||
|
Regions() []Region
|
||||||
|
}
|
||||||
|
|
||||||
|
var (
|
||||||
|
// Supported defines a Coverage that lists all supported subtags. Tags
|
||||||
|
// always returns nil.
|
||||||
|
Supported Coverage = allSubtags{}
|
||||||
|
)
|
||||||
|
|
||||||
|
// TODO:
|
||||||
|
// - Support Variants, numbering systems.
|
||||||
|
// - CLDR coverage levels.
|
||||||
|
// - Set of common tags defined in this package.
|
||||||
|
|
||||||
|
type allSubtags struct{}
|
||||||
|
|
||||||
|
// Regions returns the list of supported regions. As all regions are in a
|
||||||
|
// consecutive range, it simply returns a slice of numbers in increasing order.
|
||||||
|
// The "undefined" region is not returned.
|
||||||
|
func (s allSubtags) Regions() []Region {
|
||||||
|
reg := make([]Region, language.NumRegions)
|
||||||
|
for i := range reg {
|
||||||
|
reg[i] = Region{language.Region(i + 1)}
|
||||||
|
}
|
||||||
|
return reg
|
||||||
|
}
|
||||||
|
|
||||||
|
// Scripts returns the list of supported scripts. As all scripts are in a
|
||||||
|
// consecutive range, it simply returns a slice of numbers in increasing order.
|
||||||
|
// The "undefined" script is not returned.
|
||||||
|
func (s allSubtags) Scripts() []Script {
|
||||||
|
scr := make([]Script, language.NumScripts)
|
||||||
|
for i := range scr {
|
||||||
|
scr[i] = Script{language.Script(i + 1)}
|
||||||
|
}
|
||||||
|
return scr
|
||||||
|
}
|
||||||
|
|
||||||
|
// BaseLanguages returns the list of all supported base languages. It generates
|
||||||
|
// the list by traversing the internal structures.
|
||||||
|
func (s allSubtags) BaseLanguages() []Base {
|
||||||
|
bs := language.BaseLanguages()
|
||||||
|
base := make([]Base, len(bs))
|
||||||
|
for i, b := range bs {
|
||||||
|
base[i] = Base{b}
|
||||||
|
}
|
||||||
|
return base
|
||||||
|
}
|
||||||
|
|
||||||
|
// Tags always returns nil.
|
||||||
|
func (s allSubtags) Tags() []Tag {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// coverage is used by NewCoverage which is used as a convenient way for
|
||||||
|
// creating Coverage implementations for partially defined data. Very often a
|
||||||
|
// package will only need to define a subset of slices. coverage provides a
|
||||||
|
// convenient way to do this. Moreover, packages using NewCoverage, instead of
|
||||||
|
// their own implementation, will not break if later new slice types are added.
|
||||||
|
type coverage struct {
|
||||||
|
tags func() []Tag
|
||||||
|
bases func() []Base
|
||||||
|
scripts func() []Script
|
||||||
|
regions func() []Region
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *coverage) Tags() []Tag {
|
||||||
|
if s.tags == nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return s.tags()
|
||||||
|
}
|
||||||
|
|
||||||
|
// bases implements sort.Interface and is used to sort base languages.
|
||||||
|
type bases []Base
|
||||||
|
|
||||||
|
func (b bases) Len() int {
|
||||||
|
return len(b)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b bases) Swap(i, j int) {
|
||||||
|
b[i], b[j] = b[j], b[i]
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b bases) Less(i, j int) bool {
|
||||||
|
return b[i].langID < b[j].langID
|
||||||
|
}
|
||||||
|
|
||||||
|
// BaseLanguages returns the result from calling s.bases if it is specified or
|
||||||
|
// otherwise derives the set of supported base languages from tags.
|
||||||
|
func (s *coverage) BaseLanguages() []Base {
|
||||||
|
if s.bases == nil {
|
||||||
|
tags := s.Tags()
|
||||||
|
if len(tags) == 0 {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
a := make([]Base, len(tags))
|
||||||
|
for i, t := range tags {
|
||||||
|
a[i] = Base{language.Language(t.lang())}
|
||||||
|
}
|
||||||
|
sort.Sort(bases(a))
|
||||||
|
k := 0
|
||||||
|
for i := 1; i < len(a); i++ {
|
||||||
|
if a[k] != a[i] {
|
||||||
|
k++
|
||||||
|
a[k] = a[i]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return a[:k+1]
|
||||||
|
}
|
||||||
|
return s.bases()
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *coverage) Scripts() []Script {
|
||||||
|
if s.scripts == nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return s.scripts()
|
||||||
|
}
|
||||||
|
|
||||||
|
func (s *coverage) Regions() []Region {
|
||||||
|
if s.regions == nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return s.regions()
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewCoverage returns a Coverage for the given lists. It is typically used by
|
||||||
|
// packages providing internationalization services to define their level of
|
||||||
|
// coverage. A list may be of type []T or func() []T, where T is either Tag,
|
||||||
|
// Base, Script or Region. The returned Coverage derives the value for Bases
|
||||||
|
// from Tags if no func or slice for []Base is specified. For other unspecified
|
||||||
|
// types the returned Coverage will return nil for the respective methods.
|
||||||
|
func NewCoverage(list ...interface{}) Coverage {
|
||||||
|
s := &coverage{}
|
||||||
|
for _, x := range list {
|
||||||
|
switch v := x.(type) {
|
||||||
|
case func() []Base:
|
||||||
|
s.bases = v
|
||||||
|
case func() []Script:
|
||||||
|
s.scripts = v
|
||||||
|
case func() []Region:
|
||||||
|
s.regions = v
|
||||||
|
case func() []Tag:
|
||||||
|
s.tags = v
|
||||||
|
case []Base:
|
||||||
|
s.bases = func() []Base { return v }
|
||||||
|
case []Script:
|
||||||
|
s.scripts = func() []Script { return v }
|
||||||
|
case []Region:
|
||||||
|
s.regions = func() []Region { return v }
|
||||||
|
case []Tag:
|
||||||
|
s.tags = func() []Tag { return v }
|
||||||
|
default:
|
||||||
|
panic(fmt.Sprintf("language: unsupported set type %T", v))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
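NewCoverage, defined just above, lets a package declare which tags it actually ships data for; when only a tag list is supplied, BaseLanguages is derived from it, deduplicated, and sorted by internal language ID. A short usage sketch; AmericanEnglish, German, and Japanese are predefined tags from the public package, and the exact ordering of the derived bases depends on internal IDs:

```go
package main

import (
	"fmt"

	"golang.org/x/text/language"
)

func main() {
	// A service shipping translations for three locales advertises them here.
	c := language.NewCoverage([]language.Tag{
		language.AmericanEnglish,
		language.German,
		language.Japanese,
	})
	fmt.Println(c.BaseLanguages()) // derived from the tag list above
	fmt.Println(c.Scripts())       // nil: no script list was supplied
}
```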
102
vendor/golang.org/x/text/language/doc.go
generated
vendored
Normal file
@@ -0,0 +1,102 @@
|
||||||
|
// Copyright 2017 The Go Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file.
|
||||||
|
|
||||||
|
// Package language implements BCP 47 language tags and related functionality.
|
||||||
|
//
|
||||||
|
// The most important function of package language is to match a list of
|
||||||
|
// user-preferred languages to a list of supported languages.
|
||||||
|
// It alleviates the developer of dealing with the complexity of this process
|
||||||
|
// and provides the user with the best experience
|
||||||
|
// (see https://blog.golang.org/matchlang).
|
||||||
|
//
|
||||||
|
//
|
||||||
|
// Matching preferred against supported languages
|
||||||
|
//
|
||||||
|
// A Matcher for an application that supports English, Australian English,
|
||||||
|
// Danish, and standard Mandarin can be created as follows:
|
||||||
|
//
|
||||||
|
// var matcher = language.NewMatcher([]language.Tag{
|
||||||
|
// language.English, // The first language is used as fallback.
|
||||||
|
// language.MustParse("en-AU"),
|
||||||
|
// language.Danish,
|
||||||
|
// language.Chinese,
|
||||||
|
// })
|
||||||
|
//
|
||||||
|
// This list of supported languages is typically implied by the languages for
|
||||||
|
// which there exists translations of the user interface.
|
||||||
|
//
|
||||||
|
// User-preferred languages usually come as a comma-separated list of BCP 47
|
||||||
|
// language tags.
|
||||||
|
// The MatchString finds best matches for such strings:
|
||||||
|
//
|
||||||
|
// handler(w http.ResponseWriter, r *http.Request) {
|
||||||
|
// lang, _ := r.Cookie("lang")
|
||||||
|
// accept := r.Header.Get("Accept-Language")
|
||||||
|
// tag, _ := language.MatchStrings(matcher, lang.String(), accept)
|
||||||
|
//
|
||||||
|
// // tag should now be used for the initialization of any
|
||||||
|
// // locale-specific service.
|
||||||
|
// }
|
||||||
|
//
|
||||||
|
// The Matcher's Match method can be used to match Tags directly.
|
||||||
|
//
|
||||||
|
// Matchers are aware of the intricacies of equivalence between languages, such
|
||||||
|
// as deprecated subtags, legacy tags, macro languages, mutual
|
||||||
|
// intelligibility between scripts and languages, and transparently passing
|
||||||
|
// BCP 47 user configuration.
|
||||||
|
// For instance, it will know that a reader of Bokmål Danish can read Norwegian
|
||||||
|
// and will know that Cantonese ("yue") is a good match for "zh-HK".
|
||||||
|
//
|
||||||
|
//
|
||||||
|
// Using match results
|
||||||
|
//
|
||||||
|
// To guarantee a consistent user experience to the user it is important to
|
||||||
|
// use the same language tag for the selection of any locale-specific services.
|
||||||
|
// For example, it is utterly confusing to substitute spelled-out numbers
|
||||||
|
// or dates in one language in text of another language.
|
||||||
|
// More subtly confusing is using the wrong sorting order or casing
|
||||||
|
// algorithm for a certain language.
|
||||||
|
//
|
||||||
|
// All the packages in x/text that provide locale-specific services
|
||||||
|
// (e.g. collate, cases) should be initialized with the tag that was
|
||||||
|
// obtained at the start of an interaction with the user.
|
||||||
|
//
|
||||||
|
// Note that Tag that is returned by Match and MatchString may differ from any
|
||||||
|
// of the supported languages, as it may contain carried over settings from
|
||||||
|
// the user tags.
|
||||||
|
// This may be inconvenient when your application has some additional
|
||||||
|
// locale-specific data for your supported languages.
|
||||||
|
// Match and MatchString both return the index of the matched supported tag
|
||||||
|
// to simplify associating such data with the matched tag.
|
||||||
|
//
|
||||||
|
//
|
||||||
|
// Canonicalization
|
||||||
|
//
|
||||||
|
// If one uses the Matcher to compare languages one does not need to
|
||||||
|
// worry about canonicalization.
|
||||||
|
//
|
||||||
|
// The meaning of a Tag varies per application. The language package
|
||||||
|
// therefore delays canonicalization and preserves information as much
|
||||||
|
// as possible. The Matcher, however, will always take into account that
|
||||||
|
// two different tags may represent the same language.
|
||||||
|
//
|
||||||
|
// By default, only legacy and deprecated tags are converted into their
|
||||||
|
// canonical equivalent. All other information is preserved. This approach makes
|
||||||
|
// the confidence scores more accurate and allows matchers to distinguish
|
||||||
|
// between variants that are otherwise lost.
|
||||||
|
//
|
||||||
|
// As a consequence, two tags that should be treated as identical according to
|
||||||
|
// BCP 47 or CLDR, like "en-Latn" and "en", will be represented differently. The
|
||||||
|
// Matcher handles such distinctions, though, and is aware of the
|
||||||
|
// equivalence relations. The CanonType type can be used to alter the
|
||||||
|
// canonicalization form.
|
||||||
|
//
|
||||||
|
// References
|
||||||
|
//
|
||||||
|
// BCP 47 - Tags for Identifying Languages http://tools.ietf.org/html/bcp47
|
||||||
|
//
|
||||||
|
package language // import "golang.org/x/text/language"
|
||||||
|
|
||||||
|
// TODO: explanation on how to match languages for your own locale-specific
|
||||||
|
// service.
|
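doc.go's own example builds a Matcher from the supported tags and then feeds it the user's cookie value and Accept-Language header via MatchStrings. Expanded into a compilable sketch; the Accept-Language value and the expected match are illustrative:

```go
package main

import (
	"fmt"

	"golang.org/x/text/language"
)

var matcher = language.NewMatcher([]language.Tag{
	language.English, // the first tag is used as the fallback
	language.MustParse("en-AU"),
	language.Danish,
	language.Chinese,
})

func main() {
	// A typical browser Accept-Language header.
	accept := "da, en-GB;q=0.8, en;q=0.7"
	tag, index := language.MatchStrings(matcher, accept)
	fmt.Println(tag, index) // expected: the Danish entry (index 2)
}
```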
38
vendor/golang.org/x/text/language/go1_1.go
generated
vendored
Normal file
@@ -0,0 +1,38 @@
// Copyright 2013 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// +build !go1.2

package language

import "sort"

func sortStable(s sort.Interface) {
	ss := stableSort{
		s:   s,
		pos: make([]int, s.Len()),
	}
	for i := range ss.pos {
		ss.pos[i] = i
	}
	sort.Sort(&ss)
}

type stableSort struct {
	s   sort.Interface
	pos []int
}

func (s *stableSort) Len() int {
	return len(s.pos)
}

func (s *stableSort) Less(i, j int) bool {
	return s.s.Less(i, j) || !s.s.Less(j, i) && s.pos[i] < s.pos[j]
}

func (s *stableSort) Swap(i, j int) {
	s.s.Swap(i, j)
	s.pos[i], s.pos[j] = s.pos[j], s.pos[i]
}
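The pre-Go 1.2 shim above emulates sort.Stable by recording each element's original position and using it as the tie-breaker in Less. A hypothetical in-package test illustrating that property; sketchPair, sketchByKey, and the test name are invented for this example:

```go
package language

import "testing"

// sketchPair records a sort key and the element's original position.
type sketchPair struct{ key, pos int }

type sketchByKey []sketchPair

func (b sketchByKey) Len() int           { return len(b) }
func (b sketchByKey) Swap(i, j int)      { b[i], b[j] = b[j], b[i] }
func (b sketchByKey) Less(i, j int) bool { return b[i].key < b[j].key }

// TestSortStableSketch is an illustrative sketch, not part of the vendored code.
func TestSortStableSketch(t *testing.T) {
	data := sketchByKey{{1, 0}, {0, 1}, {1, 2}, {1, 3}}
	sortStable(data)
	for i := 1; i < len(data); i++ {
		if data[i-1].key == data[i].key && data[i-1].pos > data[i].pos {
			t.Errorf("equal keys were reordered: %v before %v", data[i-1], data[i])
		}
	}
}
```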
11
vendor/golang.org/x/text/language/go1_2.go
generated
vendored
Normal file
@@ -0,0 +1,11 @@
// Copyright 2013 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// +build go1.2

package language

import "sort"

var sortStable = sort.Stable
Some files were not shown because too many files have changed in this diff.