first commit

Izalia Mae 2022-02-19 16:22:58 -05:00
commit bdd5395a4a
19 changed files with 2739 additions and 0 deletions

.gitignore (vendored, new file, 121 lines)

@@ -0,0 +1,121 @@
# ---> Python
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# Symlink to izzylib
izzylib
test*.py
reload.cfg

LICENSE (new file, 524 lines)

@@ -0,0 +1,524 @@
IzzyLib
Copyright Zoey Mae 2021
COOPERATIVE NON-VIOLENT PUBLIC LICENSE v6
Preamble
The Cooperative Non-Violent Public license is a freedom-respecting sharealike
license for both the author of a work as well as those subject to a work.
It aims to protect the basic rights of human beings from exploitation,
the earth from plunder, and the equal treatment of the workers involved in the
creation of the work. It aims to ensure a copyrighted work is forever
available for public use, modification, and redistribution under the same
terms so long as the work is not used for harm. For more information about
the CNPL refer to the official webpage
Official Webpage: https://thufie.lain.haus/NPL.html
Terms and Conditions
THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS
COOPERATIVE NON-VIOLENT PUBLIC LICENSE v5 ("LICENSE"). THE WORK IS
PROTECTED BY COPYRIGHT AND ALL OTHER APPLICABLE LAWS. ANY USE OF THE
WORK OTHER THAN AS AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS
PROHIBITED. BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED IN THIS
LICENSE, YOU AGREE TO BE BOUND BY THE TERMS OF THIS LICENSE.
TO THE EXTENT THIS LICENSE MAY BE CONSIDERED TO BE A CONTRACT,
THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED HERE IN AS CONSIDERATION
FOR ACCEPTING THE TERMS AND CONDITIONS OF THIS LICENSE AND FOR AGREEING
TO BE BOUND BY THE TERMS AND CONDITIONS OF THIS LICENSE.
1. DEFINITIONS
a. "Act of War" means any action of one country against any group
either with an intention to provoke a conflict or an action that
occurs during a declared war or during armed conflict between
military forces of any origin. This includes but is not limited
to enforcing sanctions or sieges, supplying armed forces,
or profiting from the manufacture of tools or weaponry used in
military conflict.
b. "Adaptation" means a work based upon the Work, or upon the
Work and other pre-existing works, such as a translation,
adaptation, derivative work, arrangement of music or other
alterations of a literary or artistic work, or phonogram or
performance and includes cinematographic adaptations or any
other form in which the Work may be recast, transformed, or
adapted including in any form recognizably derived from the
original, except that a work that constitutes a Collection will
not be considered an Adaptation for the purpose of this License.
For the avoidance of doubt, where the Work is a musical work,
performance or phonogram, the synchronization of the Work in
timed-relation with a moving image ("synching") will be
considered an Adaptation for the purpose of this License. In
addition, where the Work is designed to output a neural network
the output of the neural network will be considered an
Adaptation for the purpose of this license.
c. "Bodily Harm" means any physical hurt or injury to a person that
interferes with the health or comfort of the person and that is more
than merely transient or trifling in nature.
d. "Collection" means a collection of literary or artistic
works, such as encyclopedias and anthologies, or performances,
phonograms or broadcasts, or other works or subject matter other
than works listed in Section 1(i) below, which, by reason of the
selection and arrangement of their contents, constitute
intellectual creations, in which the Work is included in its
entirety in unmodified form along with one or more other
contributions, each constituting separate and independent works
in themselves, which together are assembled into a collective
whole. A work that constitutes a Collection will not be
considered an Adaptation (as defined above) for the purposes of
this License.
e. "Distribute" means to make available to the public the
original and copies of the Work or Adaptation, as appropriate,
through sale, gift or any other transfer of possession or
ownership.
f. "Incarceration" means confinement in a jail, prison, or
any other place where individuals of any kind are held against
either their will or (if their will cannot be determined) the
will of their legal guardian or guardians. In the case of a
conflict between the will of the individual and the will of
their legal guardian or guardians, the will of the
individual will take precedence.
g. "Licensor" means the individual, individuals, entity or
entities that offer(s) the Work under the terms of this License.
h. "Original Author" means, in the case of a literary or
artistic work, the individual, individuals, entity or entities
who created the Work or if no individual or entity can be
identified, the publisher; and in addition (i) in the case of a
performance the actors, singers, musicians, dancers, and other
persons who act, sing, deliver, declaim, play in, interpret or
otherwise perform literary or artistic works or expressions of
folklore; (ii) in the case of a phonogram the producer being the
person or legal entity who first fixes the sounds of a
performance or other sounds; and, (iii) in the case of
broadcasts, the organization that transmits the broadcast.
i. "Work" means the literary and/or artistic work offered under
the terms of this License including without limitation any
production in the literary, scientific and artistic domain,
whatever may be the mode or form of its expression including
digital form, such as a book, pamphlet and other writing; a
lecture, address, sermon or other work of the same nature; a
dramatic or dramatico-musical work; a choreographic work or
entertainment in dumb show; a musical composition with or
without words; a cinematographic work to which are assimilated
works expressed by a process analogous to cinematography; a work
of drawing, painting, architecture, sculpture, engraving or
lithography; a photographic work to which are assimilated works
expressed by a process analogous to photography; a work of
applied art; an illustration, map, plan, sketch or
three-dimensional work relative to geography, topography,
architecture or science; a performance; a broadcast; a
phonogram; a compilation of data to the extent it is protected
as a copyrightable work; or a work performed by a variety or
circus performer to the extent it is not otherwise considered a
literary or artistic work.
j. "You" means an individual or entity exercising rights under
this License who has not previously violated the terms of this
License with respect to the Work, or who has received express
permission from the Licensor to exercise rights under this
License despite a previous violation.
k. "Publicly Perform" means to perform public recitations of the
Work and to communicate to the public those public recitations,
by any means or process, including by wire or wireless means or
public digital performances; to make available to the public
Works in such a way that members of the public may access these
Works from a place and at a place individually chosen by them;
to perform the Work to the public by any means or process and
the communication to the public of the performances of the Work,
including by public digital performance; to broadcast and
rebroadcast the Work by any means including signs, sounds or
images.
l. "Reproduce" means to make copies of the Work by any means
including without limitation by sound or visual recordings and
the right of fixation and reproducing fixations of the Work,
including storage of a protected performance or phonogram in
digital form or other electronic medium.
m. "Software" means any digital Work which, through use of a
third-party piece of Software or through the direct usage of
itself on a computer system, the memory of the computer is
modified dynamically or semi-dynamically. "Software",
secondly, processes or interprets information.
n. "Source Code" means the human-readable form of Software
through which the Original Author and/or Distributor originally
created, derived, and/or modified it.
o. "Surveilling" means the use of the Work to either
overtly or covertly observe and record persons and or their
activities.
p. "Network Service" means the use of a piece of Software to
interpret or modify information that is subsequently and directly
served to users over the Internet.
q. "Discriminate" means the use of a work to differentiate between
humans in a such a way which prioritizes some above others on the
basis of percieved membership within certain groups.
r. "Hate Speech" means communication or any form
of expression which is solely for the purpose of expressing hatred
for some group or advocating a form of Discrimination
(to Discriminate per definition in (q)) between humans.
s. "Coercion" means leveraging of the threat of force or use of force
to intimidate a person in order to gain compliance, or to offer
large incentives which aim to entice a person to act against their
will.
2. FAIR DEALING RIGHTS
Nothing in this License is intended to reduce, limit, or restrict any
uses free from copyright or rights arising from limitations or
exceptions that are provided for in connection with the copyright
protection under copyright law or other applicable laws.
3. LICENSE GRANT
Subject to the terms and conditions of this License, Licensor hereby
grants You a worldwide, royalty-free, non-exclusive, perpetual (for the
duration of the applicable copyright) license to exercise the rights in
the Work as stated below:
a. to Reproduce the Work, to incorporate the Work into one or
more Collections, and to Reproduce the Work as incorporated in
the Collections;
b. to create and Reproduce Adaptations provided that any such
Adaptation, including any translation in any medium, takes
reasonable steps to clearly label, demarcate or otherwise
identify that changes were made to the original Work. For
example, a translation could be marked "The original work was
translated from English to Spanish," or a modification could
indicate "The original work has been modified.";
c. to Distribute and Publicly Perform the Work including as
incorporated in Collections; and,
d. to Distribute and Publicly Perform Adaptations. The above
rights may be exercised in all media and formats whether now
known or hereafter devised. The above rights include the right
to make such modifications as are technically necessary to
exercise the rights in other media and formats. Subject to
Section 8(g), all rights not expressly granted by Licensor are
hereby reserved, including but not limited to the rights set
forth in Section 4(i).
4. RESTRICTIONS
The license granted in Section 3 above is expressly made subject to and
limited by the following restrictions:
a. You may Distribute or Publicly Perform the Work only under
the terms of this License. You must include a copy of, or the
Uniform Resource Identifier (URI) for, this License with every
copy of the Work You Distribute or Publicly Perform. You may not
offer or impose any terms on the Work that restrict the terms of
this License or the ability of the recipient of the Work to
exercise the rights granted to that recipient under the terms of
the License. You may not sublicense the Work. You must keep
intact all notices that refer to this License and to the
disclaimer of warranties with every copy of the Work You
Distribute or Publicly Perform. When You Distribute or Publicly
Perform the Work, You may not impose any effective technological
measures on the Work that restrict the ability of a recipient of
the Work from You to exercise the rights granted to that
recipient under the terms of the License. This Section 4(a)
applies to the Work as incorporated in a Collection, but this
does not require the Collection apart from the Work itself to be
made subject to the terms of this License. If You create a
Collection, upon notice from any Licensor You must, to the
extent practicable, remove from the Collection any credit as
required by Section 4(h), as requested. If You create an
Adaptation, upon notice from any Licensor You must, to the
extent practicable, remove from the Adaptation any credit as
required by Section 4(h), as requested.
b. Subject to the exception in Section 4(e), you may not
exercise any of the rights granted to You in Section 3 above in
any manner that is primarily intended for or directed toward
commercial advantage or private monetary compensation. The
exchange of the Work for other copyrighted works by means of
digital file-sharing or otherwise shall not be considered to be
intended for or directed toward commercial advantage or private
monetary compensation, provided there is no payment of any
monetary compensation in connection with the exchange of
copyrighted works.
c. If the Work meets the definition of Software, You may exercise
the rights granted in Section 3 only if You provide a copy of the
corresponding Source Code from which the Work was derived in digital
form, or You provide a URI for the corresponding Source Code of
the Work, to any recipients upon request.
d. If the Work is used as or for a Network Service, You may exercise
the rights granted in Section 3 only if You provide a copy of the
corresponding Source Code from which the Work was derived in digital
form, or You provide a URI for the corresponding Source Code to the
Work, to any recipients of the data served or modified by the Web
Service.
e. You may exercise the rights granted in Section 3 for
commercial purposes only if:
i. You are a worker-owned business or worker-owned
collective; and
ii. after tax, all financial gain, surplus, profits and
benefits produced by the business or collective are
distributed among the worker-owners unless a set amount
is to be allocated towards community projects as decided
by a previously-established consensus agreement between the
worker-owners where all worker-owners agreed
iii. You are not using such rights on behalf of a business
other than those specified in 4(e.i) and elaborated upon in
4(e.ii), nor are using such rights as a proxy on behalf of a
business with the intent to circumvent the aforementioned
restrictions on such a business.
f. Any use by a business that is privately owned and managed,
and that seeks to generate profit from the labor of employees
paid by salary or other wages, is not permitted under this
license.
g. You may exercise the rights granted in Section 3 for
any purposes only if:
i. You do not use the Work for the purpose of inflicting
Bodily Harm on human beings (subject to criminal
prosecution or otherwise) outside of providing medical aid
or undergoing a voluntary procedure under no form of
Coercion.
ii.You do not use the Work for the purpose of Surveilling
or tracking individuals for financial gain.
iii. You do not use the Work in an Act of War.
iv. You do not use the Work for the purpose of supporting
or profiting from an Act of War.
v. You do not use the Work for the purpose of Incarceration.
vi. You do not use the Work for the purpose of extracting,
processing, or refining, oil, gas, or coal. Or to in any other
way to deliberately pollute the environment as a byproduct
of manufacturing or irresponsible disposal of hazardous materials.
vii. You do not use the Work for the purpose of
expediting, coordinating, or facilitating paid work
undertaken by individuals under the age of 12 years.
viii. You do not use the Work to either Discriminate or
spread Hate Speech on the basis of sex, sexual orientation,
gender identity, race, age, disability, color, national origin,
religion, or lower economic status.
h. If You Distribute, or Publicly Perform the Work or any
Adaptations or Collections, You must, unless a request has been
made pursuant to Section 4(a), keep intact all copyright notices
for the Work and provide, reasonable to the medium or means You
are utilizing: (i) the name of the Original Author (or
pseudonym, if applicable) if supplied, and/or if the Original
Author and/or Licensor designate another party or parties (e.g.,
a sponsor institute, publishing entity, journal) for attribution
("Attribution Parties") in Licensor's copyright notice, terms of
service or by other reasonable means, the name of such party or
parties; (ii) the title of the Work if supplied; (iii) to the
extent reasonably practicable, the URI, if any, that Licensor
to be associated with the Work, unless such URI does
not refer to the copyright notice or licensing information for
the Work; and, (iv) consistent with Section 3(b), in the case of
an Adaptation, a credit identifying the use of the Work in the
Adaptation (e.g., "French translation of the Work by Original
Author," or "Screenplay based on original Work by Original
Author"). The credit required by this Section 4(h) may be
implemented in any reasonable manner; provided, however, that in
the case of an Adaptation or Collection, at a minimum such credit
will appear, if a credit for all contributing authors of the
Adaptation or Collection appears, then as part of these credits
and in a manner at least as prominent as the credits for the
other contributing authors. For the avoidance of doubt, You may
only use the credit required by this Section for the purpose of
attribution in the manner set out above and, by exercising Your
rights under this License, You may not implicitly or explicitly
assert or imply any connection with, sponsorship or endorsement
by the Original Author, Licensor and/or Attribution Parties, as
appropriate, of You or Your use of the Work, without the
separate, express prior written permission of the Original
Author, Licensor and/or Attribution Parties.
i. For the avoidance of doubt:
i. Non-waivable Compulsory License Schemes. In those
jurisdictions in which the right to collect royalties
through any statutory or compulsory licensing scheme
cannot be waived, the Licensor reserves the exclusive
right to collect such royalties for any exercise by You of
the rights granted under this License;
ii. Waivable Compulsory License Schemes. In those
jurisdictions in which the right to collect royalties
through any statutory or compulsory licensing scheme can
be waived, the Licensor reserves the exclusive right to
collect such royalties for any exercise by You of the
rights granted under this License if Your exercise of such
rights is for a purpose or use which is otherwise than
noncommercial as permitted under Section 4(b) and
otherwise waives the right to collect royalties through
any statutory or compulsory licensing scheme; and,
iii.Voluntary License Schemes. The Licensor reserves the
right to collect royalties, whether individually or, in
the event that the Licensor is a member of a collecting
society that administers voluntary licensing schemes, via
that society, from any exercise by You of the rights
granted under this License that is for a purpose or use
which is otherwise than noncommercial as permitted under
Section 4(b).
j. Except as otherwise agreed in writing by the Licensor or as
may be otherwise permitted by applicable law, if You Reproduce,
Distribute or Publicly Perform the Work either by itself or as
part of any Adaptations or Collections, You must not distort,
mutilate, modify or take other derogatory action in relation to
the Work which would be prejudicial to the Original Author's
honor or reputation. Licensor agrees that in those jurisdictions
(e.g. Japan), in which any exercise of the right granted in
Section 3(b) of this License (the right to make Adaptations)
would be deemed to be a distortion, mutilation, modification or
other derogatory action prejudicial to the Original Author's
honor and reputation, the Licensor will waive or not assert, as
appropriate, this Section, to the fullest extent permitted by
the applicable national law, to enable You to reasonably
exercise Your right under Section 3(b) of this License (right to
make Adaptations) but not otherwise.
k. Do not make any legal claim against anyone accusing the
Work, with or without changes, alone or with other works,
of infringing any patent claim.
5. REPRESENTATIONS, WARRANTIES AND DISCLAIMER
UNLESS OTHERWISE MUTUALLY AGREED TO BY THE PARTIES IN WRITING, LICENSOR
OFFERS THE WORK AS-IS AND MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY
KIND CONCERNING THE WORK, EXPRESS, IMPLIED, STATUTORY OR OTHERWISE,
INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTIBILITY,
FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF
LATENT OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OF ABSENCE OF
ERRORS, WHETHER OR NOT DISCOVERABLE. SOME JURISDICTIONS DO NOT ALLOW
THE EXCLUSION OF IMPLIED WARRANTIES, SO SUCH EXCLUSION MAY NOT APPLY TO
YOU.
6. LIMITATION ON LIABILITY
EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE LAW, IN NO EVENT WILL
LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY SPECIAL,
INCIDENTAL, CONSEQUENTIAL, PUNITIVE OR EXEMPLARY DAMAGES ARISING OUT OF
THIS LICENSE OR THE USE OF THE WORK, EVEN IF LICENSOR HAS BEEN ADVISED
OF THE POSSIBILITY OF SUCH DAMAGES.
7. TERMINATION
a. This License and the rights granted hereunder will terminate
automatically upon any breach by You of the terms of this
License. Individuals or entities who have received Adaptations
or Collections from You under this License, however, will not
have their licenses terminated provided such individuals or
entities remain in full compliance with those licenses. Sections
1, 2, 5, 6, 7, and 8 will survive any termination of this
License.
b. Subject to the above terms and conditions, the license
granted here is perpetual (for the duration of the applicable
copyright in the Work). Notwithstanding the above, Licensor
reserves the right to release the Work under different license
terms or to stop distributing the Work at any time; provided,
however that any such election will not serve to withdraw this
License (or any other license that has been, or is required to
be, granted under the terms of this License), and this License
will continue in full force and effect unless terminated as
stated above.
8. REVISED LICENSE VERSIONS
a. This License may receive future revisions in the original
spirit of the license intended to strengthen This License.
Each version of This License has an incrementing version number.
b. Unless otherwise specified like in Section 8(c) The Licensor
has only granted this current version of This License for The Work.
In this case future revisions do not apply.
c. The Licensor may specify that the latest available
revision of This License be used for The Work by either explicitly
writing so or by suffixing the License URI with a "+" symbol.
d. The Licensor may specify that The Work is also available
under the terms of This License's current revision as well
as specific future revisions. The Licensor may do this by
writing it explicitly or suffixing the License URI with any
additional version numbers each separated by a comma.
9. MISCELLANEOUS
a. Each time You Distribute or Publicly Perform the Work or a
Collection, the Licensor offers to the recipient a license to
the Work on the same terms and conditions as the license granted
to You under this License.
b. Each time You Distribute or Publicly Perform an Adaptation,
Licensor offers to the recipient a license to the original Work
on the same terms and conditions as the license granted to You
under this License.
c. If the Work is classified as Software, each time You Distribute
or Publicly Perform an Adaptation, Licensor offers to the recipient
a copy and/or URI of the corresponding Source Code on the same
terms and conditions as the license granted to You under this License.
d. If the Work is used as a Network Service, each time You Distribute
or Publicly Perform an Adaptation, or serve data derived from the
Software, the Licensor offers to any recipients of the data a copy
and/or URI of the corresponding Source Code on the same terms and
conditions as the license granted to You under this License.
e. If any provision of this License is invalid or unenforceable
under applicable law, it shall not affect the validity or
enforceability of the remainder of the terms of this License,
and without further action by the parties to this agreement,
such provision shall be reformed to the minimum extent necessary
to make such provision valid and enforceable.
f. No term or provision of this License shall be deemed waived
and no breach consented to unless such waiver or consent shall
be in writing and signed by the party to be charged with such
waiver or consent.
g. This License constitutes the entire agreement between the
parties with respect to the Work licensed here. There are no
understandings, agreements or representations with respect to
the Work not specified here. Licensor shall not be bound by any
additional provisions that may appear in any communication from
You. This License may not be modified without the mutual written
agreement of the Licensor and You.
h. The rights granted under, and the subject matter referenced,
in this License were drafted utilizing the terminology of the
Berne Convention for the Protection of Literary and Artistic
Works (as amended on September 28, 1979), the Rome Convention of
1961, the WIPO Copyright Treaty of 1996, the WIPO Performances
and Phonograms Treaty of 1996 and the Universal Copyright
Convention (as revised on July 24, 1971). These rights and
subject matter take effect in the relevant jurisdiction in which
the License terms are sought to be enforced according to the
corresponding provisions of the implementation of those treaty
provisions in the applicable national law. If the standard suite
of rights granted under applicable copyright law includes
additional rights not granted under this License, such
additional rights are deemed to be included in the License; this
License is not intended to restrict the license of any rights
under applicable law.

MANIFEST.in (new file, 1 line)

@@ -0,0 +1 @@
recursive-include barkshark_http/frontend *

Pipfile (new file, 20 lines)

@@ -0,0 +1,20 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
colour = "*"
hamlish-jinja = "*"
http-router = "*"
python-magic = "*"
pycryptodome = "*"
tldextract = "*"
jinja2 = "*"
argon2-cffi = "*"
markdown = "*"
[dev-packages]
[requires]
python_version = "3.9"

Pipfile.lock (generated, new file, 351 lines)

@@ -0,0 +1,351 @@
{
"_meta": {
"hash": {
"sha256": "b556a2c8ff27882c23a6c2eb27b3f661501055d8cafc22e88db1508e6af072a2"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3.9"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"argon2-cffi": {
"hashes": [
"sha256:8c976986f2c5c0e5000919e6de187906cfd81fb1c72bf9d88c01177e77da7f80",
"sha256:d384164d944190a7dd7ef22c6aa3ff197da12962bd04b17f64d4e93d934dba5b"
],
"index": "pypi",
"version": "==21.3.0"
},
"argon2-cffi-bindings": {
"hashes": [
"sha256:20ef543a89dee4db46a1a6e206cd015360e5a75822f76df533845c3cbaf72670",
"sha256:2c3e3cc67fdb7d82c4718f19b4e7a87123caf8a93fde7e23cf66ac0337d3cb3f",
"sha256:3b9ef65804859d335dc6b31582cad2c5166f0c3e7975f324d9ffaa34ee7e6583",
"sha256:3e385d1c39c520c08b53d63300c3ecc28622f076f4c2b0e6d7e796e9f6502194",
"sha256:58ed19212051f49a523abb1dbe954337dc82d947fb6e5a0da60f7c8471a8476c",
"sha256:5e00316dabdaea0b2dd82d141cc66889ced0cdcbfa599e8b471cf22c620c329a",
"sha256:603ca0aba86b1349b147cab91ae970c63118a0f30444d4bc80355937c950c082",
"sha256:6a22ad9800121b71099d0fb0a65323810a15f2e292f2ba450810a7316e128ee5",
"sha256:8cd69c07dd875537a824deec19f978e0f2078fdda07fd5c42ac29668dda5f40f",
"sha256:93f9bf70084f97245ba10ee36575f0c3f1e7d7724d67d8e5b08e61787c320ed7",
"sha256:9524464572e12979364b7d600abf96181d3541da11e23ddf565a32e70bd4dc0d",
"sha256:b2ef1c30440dbbcba7a5dc3e319408b59676e2e039e2ae11a8775ecf482b192f",
"sha256:b746dba803a79238e925d9046a63aa26bf86ab2a2fe74ce6b009a1c3f5c8f2ae",
"sha256:bb89ceffa6c791807d1305ceb77dbfacc5aa499891d2c55661c6459651fc39e3",
"sha256:bd46088725ef7f58b5a1ef7ca06647ebaf0eb4baff7d1d0d177c6cc8744abd86",
"sha256:ccb949252cb2ab3a08c02024acb77cfb179492d5701c7cbdbfd776124d4d2367",
"sha256:d4966ef5848d820776f5f562a7d45fdd70c2f330c961d0d745b784034bd9f48d",
"sha256:e415e3f62c8d124ee16018e491a009937f8cf7ebf5eb430ffc5de21b900dad93",
"sha256:ed2937d286e2ad0cc79a7087d3c272832865f779430e0cc2b4f3718d3159b0cb",
"sha256:f1152ac548bd5b8bcecfb0b0371f082037e47128653df2e8ba6e914d384f3c3e",
"sha256:f9f8b450ed0547e3d473fdc8612083fd08dd2120d6ac8f73828df9b7d45bb351"
],
"markers": "python_version >= '3.6'",
"version": "==21.2.0"
},
"certifi": {
"hashes": [
"sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872",
"sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569"
],
"version": "==2021.10.8"
},
"cffi": {
"hashes": [
"sha256:00c878c90cb53ccfaae6b8bc18ad05d2036553e6d9d1d9dbcf323bbe83854ca3",
"sha256:0104fb5ae2391d46a4cb082abdd5c69ea4eab79d8d44eaaf79f1b1fd806ee4c2",
"sha256:06c48159c1abed75c2e721b1715c379fa3200c7784271b3c46df01383b593636",
"sha256:0808014eb713677ec1292301ea4c81ad277b6cdf2fdd90fd540af98c0b101d20",
"sha256:10dffb601ccfb65262a27233ac273d552ddc4d8ae1bf93b21c94b8511bffe728",
"sha256:14cd121ea63ecdae71efa69c15c5543a4b5fbcd0bbe2aad864baca0063cecf27",
"sha256:17771976e82e9f94976180f76468546834d22a7cc404b17c22df2a2c81db0c66",
"sha256:181dee03b1170ff1969489acf1c26533710231c58f95534e3edac87fff06c443",
"sha256:23cfe892bd5dd8941608f93348c0737e369e51c100d03718f108bf1add7bd6d0",
"sha256:263cc3d821c4ab2213cbe8cd8b355a7f72a8324577dc865ef98487c1aeee2bc7",
"sha256:2756c88cbb94231c7a147402476be2c4df2f6078099a6f4a480d239a8817ae39",
"sha256:27c219baf94952ae9d50ec19651a687b826792055353d07648a5695413e0c605",
"sha256:2a23af14f408d53d5e6cd4e3d9a24ff9e05906ad574822a10563efcef137979a",
"sha256:31fb708d9d7c3f49a60f04cf5b119aeefe5644daba1cd2a0fe389b674fd1de37",
"sha256:3415c89f9204ee60cd09b235810be700e993e343a408693e80ce7f6a40108029",
"sha256:3773c4d81e6e818df2efbc7dd77325ca0dcb688116050fb2b3011218eda36139",
"sha256:3b96a311ac60a3f6be21d2572e46ce67f09abcf4d09344c49274eb9e0bf345fc",
"sha256:3f7d084648d77af029acb79a0ff49a0ad7e9d09057a9bf46596dac9514dc07df",
"sha256:41d45de54cd277a7878919867c0f08b0cf817605e4eb94093e7516505d3c8d14",
"sha256:4238e6dab5d6a8ba812de994bbb0a79bddbdf80994e4ce802b6f6f3142fcc880",
"sha256:45db3a33139e9c8f7c09234b5784a5e33d31fd6907800b316decad50af323ff2",
"sha256:45e8636704eacc432a206ac7345a5d3d2c62d95a507ec70d62f23cd91770482a",
"sha256:4958391dbd6249d7ad855b9ca88fae690783a6be9e86df65865058ed81fc860e",
"sha256:4a306fa632e8f0928956a41fa8e1d6243c71e7eb59ffbd165fc0b41e316b2474",
"sha256:57e9ac9ccc3101fac9d6014fba037473e4358ef4e89f8e181f8951a2c0162024",
"sha256:59888172256cac5629e60e72e86598027aca6bf01fa2465bdb676d37636573e8",
"sha256:5e069f72d497312b24fcc02073d70cb989045d1c91cbd53979366077959933e0",
"sha256:64d4ec9f448dfe041705426000cc13e34e6e5bb13736e9fd62e34a0b0c41566e",
"sha256:6dc2737a3674b3e344847c8686cf29e500584ccad76204efea14f451d4cc669a",
"sha256:74fdfdbfdc48d3f47148976f49fab3251e550a8720bebc99bf1483f5bfb5db3e",
"sha256:75e4024375654472cc27e91cbe9eaa08567f7fbdf822638be2814ce059f58032",
"sha256:786902fb9ba7433aae840e0ed609f45c7bcd4e225ebb9c753aa39725bb3e6ad6",
"sha256:8b6c2ea03845c9f501ed1313e78de148cd3f6cad741a75d43a29b43da27f2e1e",
"sha256:91d77d2a782be4274da750752bb1650a97bfd8f291022b379bb8e01c66b4e96b",
"sha256:91ec59c33514b7c7559a6acda53bbfe1b283949c34fe7440bcf917f96ac0723e",
"sha256:920f0d66a896c2d99f0adbb391f990a84091179542c205fa53ce5787aff87954",
"sha256:a5263e363c27b653a90078143adb3d076c1a748ec9ecc78ea2fb916f9b861962",
"sha256:abb9a20a72ac4e0fdb50dae135ba5e77880518e742077ced47eb1499e29a443c",
"sha256:c2051981a968d7de9dd2d7b87bcb9c939c74a34626a6e2f8181455dd49ed69e4",
"sha256:c21c9e3896c23007803a875460fb786118f0cdd4434359577ea25eb556e34c55",
"sha256:c2502a1a03b6312837279c8c1bd3ebedf6c12c4228ddbad40912d671ccc8a962",
"sha256:d4d692a89c5cf08a8557fdeb329b82e7bf609aadfaed6c0d79f5a449a3c7c023",
"sha256:da5db4e883f1ce37f55c667e5c0de439df76ac4cb55964655906306918e7363c",
"sha256:e7022a66d9b55e93e1a845d8c9eba2a1bebd4966cd8bfc25d9cd07d515b33fa6",
"sha256:ef1f279350da2c586a69d32fc8733092fd32cc8ac95139a00377841f59a3f8d8",
"sha256:f54a64f8b0c8ff0b64d18aa76675262e1700f3995182267998c31ae974fbc382",
"sha256:f5c7150ad32ba43a07c4479f40241756145a1f03b43480e058cfd862bf5041c7",
"sha256:f6f824dc3bce0edab5f427efcfb1d63ee75b6fcb7282900ccaf925be84efb0fc",
"sha256:fd8a250edc26254fe5b33be00402e6d287f562b6a5b2152dec302fa15bb3e997",
"sha256:ffaa5c925128e29efbde7301d8ecaf35c8c60ffbcd6a1ffd3a552177c8e5e796"
],
"version": "==1.15.0"
},
"charset-normalizer": {
"hashes": [
"sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597",
"sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"
],
"markers": "python_version >= '3'",
"version": "==2.0.12"
},
"colour": {
"hashes": [
"sha256:33f6db9d564fadc16e59921a56999b79571160ce09916303d35346dddc17978c",
"sha256:af20120fefd2afede8b001fbef2ea9da70ad7d49fafdb6489025dae8745c3aee"
],
"index": "pypi",
"version": "==0.1.5"
},
"filelock": {
"hashes": [
"sha256:9cd540a9352e432c7246a48fe4e8712b10acb1df2ad1f30e8c070b82ae1fed85",
"sha256:f8314284bfffbdcfa0ff3d7992b023d4c628ced6feb957351d4c48d059f56bc0"
],
"markers": "python_version >= '3.7'",
"version": "==3.6.0"
},
"hamlish-jinja": {
"hashes": [
"sha256:4a694a0eb51d0ab7237a96dd7d87aab478bcbdd112b9601097f9731ed7bd173c"
],
"index": "pypi",
"version": "==0.3.3"
},
"http-router": {
"hashes": [
"sha256:0a754392dff0169489daebbc6dc521a452347dff4b56aeea631087dafd8d2223",
"sha256:0bf70959e6fe412b58d010af07310f4d703af27df23b2a13caaf7c69e5499d86",
"sha256:0eb96e9148ab1500f825e6f6b40c20571dcbb7ac7846e52414305b24fb9f6779",
"sha256:3a4cc60da8fe811efb8a5cfc0e55ee205b7a0b08c48074c6f05afc87f6bca6c0",
"sha256:3a675ca74f3c5b2e8532d4a3a5f9441eded46313776be17595e7e53bb9c3bb2f",
"sha256:3d8ba7209051c03b6e78e304774966fc0177850a0f836f307ab79174816f15a5",
"sha256:508ecb308bae9a9f42cf4a48245fae027aaf826211808c9da0fca6c851f15e3d",
"sha256:74e2cf082d7b3bb5c9837b0c583837e6accd77f2ec1fe69aa0a90751c546a123",
"sha256:814694ca90adb79793ae8822d5ab6bf39631f3e35526dafd1f54c8ecfa26c54f",
"sha256:8d4d635aacbc8d110a0ee6331ca5feaab27e77ef1a6841a2c994fe49f8a18458",
"sha256:91d6615d25d28420d8e701d14e50fc8c5e2e6ca05dc609f896852304f49f3dac",
"sha256:9c0e9e324c26ffbac83cc588a6141bca31e875ef9558a273109d4e159684b380",
"sha256:a03ad3addba310f9ea5a0a52fb852e27a19dabff95419e1661a93b56d70c0d51",
"sha256:c37a415951532c3db79e69412bd290c9dafa19d2aad4180c24eaa114a45604b3",
"sha256:cb6cc3da118eb4a41e566632d5dc5860fbbd9b58bf836fb1c9ceec19fffa35fa",
"sha256:dd1a72901b30dd65b72556e291b992735784730e8802b0e943e0aafea110bf80",
"sha256:dd2c513e3fd385a3b15a6ff3fd59f15890bccfa41746c5b305d898e5a66e6d0e",
"sha256:e531cc214c63bb640a0cd03c73c83601d5ee69ffc64f763bd905b024b9b7c3eb",
"sha256:efe4280a999d2d7b68062a085e3d12a73b6a6551f1da546a4ac3bcdc793eae0b",
"sha256:fb7455d318f43abe26a525d7963cdacb9b0763b5e9fdd04054af2d2e2b60dc73"
],
"index": "pypi",
"version": "==2.6.5"
},
"idna": {
"hashes": [
"sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff",
"sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"
],
"markers": "python_version >= '3.5'",
"version": "==3.3"
},
"importlib-metadata": {
"hashes": [
"sha256:175f4ee440a0317f6e8d81b7f8d4869f93316170a65ad2b007d2929186c8052c",
"sha256:e0bc84ff355328a4adfc5240c4f211e0ab386f80aa640d1b11f0618a1d282094"
],
"markers": "python_version < '3.10'",
"version": "==4.11.1"
},
"jinja2": {
"hashes": [
"sha256:077ce6014f7b40d03b47d1f1ca4b0fc8328a692bd284016f806ed0eaca390ad8",
"sha256:611bb273cd68f3b993fabdc4064fc858c5b47a973cb5aa7999ec1ba405c87cd7"
],
"index": "pypi",
"version": "==3.0.3"
},
"markdown": {
"hashes": [
"sha256:76df8ae32294ec39dcf89340382882dfa12975f87f45c3ed1ecdb1e8cefc7006",
"sha256:9923332318f843411e9932237530df53162e29dc7a4e2b91e35764583c46c9a3"
],
"index": "pypi",
"version": "==3.3.6"
},
"markupsafe": {
"hashes": [
"sha256:023af8c54fe63530545f70dd2a2a7eed18d07a9a77b94e8bf1e2ff7f252db9a3",
"sha256:09c86c9643cceb1d87ca08cdc30160d1b7ab49a8a21564868921959bd16441b8",
"sha256:142119fb14a1ef6d758912b25c4e803c3ff66920635c44078666fe7cc3f8f759",
"sha256:1d1fb9b2eec3c9714dd936860850300b51dbaa37404209c8d4cb66547884b7ed",
"sha256:204730fd5fe2fe3b1e9ccadb2bd18ba8712b111dcabce185af0b3b5285a7c989",
"sha256:24c3be29abb6b34052fd26fc7a8e0a49b1ee9d282e3665e8ad09a0a68faee5b3",
"sha256:290b02bab3c9e216da57c1d11d2ba73a9f73a614bbdcc027d299a60cdfabb11a",
"sha256:3028252424c72b2602a323f70fbf50aa80a5d3aa616ea6add4ba21ae9cc9da4c",
"sha256:30c653fde75a6e5eb814d2a0a89378f83d1d3f502ab710904ee585c38888816c",
"sha256:3cace1837bc84e63b3fd2dfce37f08f8c18aeb81ef5cf6bb9b51f625cb4e6cd8",
"sha256:4056f752015dfa9828dce3140dbadd543b555afb3252507348c493def166d454",
"sha256:454ffc1cbb75227d15667c09f164a0099159da0c1f3d2636aa648f12675491ad",
"sha256:598b65d74615c021423bd45c2bc5e9b59539c875a9bdb7e5f2a6b92dfcfc268d",
"sha256:599941da468f2cf22bf90a84f6e2a65524e87be2fce844f96f2dd9a6c9d1e635",
"sha256:5ddea4c352a488b5e1069069f2f501006b1a4362cb906bee9a193ef1245a7a61",
"sha256:62c0285e91414f5c8f621a17b69fc0088394ccdaa961ef469e833dbff64bd5ea",
"sha256:679cbb78914ab212c49c67ba2c7396dc599a8479de51b9a87b174700abd9ea49",
"sha256:6e104c0c2b4cd765b4e83909cde7ec61a1e313f8a75775897db321450e928cce",
"sha256:736895a020e31b428b3382a7887bfea96102c529530299f426bf2e636aacec9e",
"sha256:75bb36f134883fdbe13d8e63b8675f5f12b80bb6627f7714c7d6c5becf22719f",
"sha256:7d2f5d97fcbd004c03df8d8fe2b973fe2b14e7bfeb2cfa012eaa8759ce9a762f",
"sha256:80beaf63ddfbc64a0452b841d8036ca0611e049650e20afcb882f5d3c266d65f",
"sha256:84ad5e29bf8bab3ad70fd707d3c05524862bddc54dc040982b0dbcff36481de7",
"sha256:8da5924cb1f9064589767b0f3fc39d03e3d0fb5aa29e0cb21d43106519bd624a",
"sha256:961eb86e5be7d0973789f30ebcf6caab60b844203f4396ece27310295a6082c7",
"sha256:96de1932237abe0a13ba68b63e94113678c379dca45afa040a17b6e1ad7ed076",
"sha256:a0a0abef2ca47b33fb615b491ce31b055ef2430de52c5b3fb19a4042dbc5cadb",
"sha256:b2a5a856019d2833c56a3dcac1b80fe795c95f401818ea963594b345929dffa7",
"sha256:b8811d48078d1cf2a6863dafb896e68406c5f513048451cd2ded0473133473c7",
"sha256:c532d5ab79be0199fa2658e24a02fce8542df196e60665dd322409a03db6a52c",
"sha256:d3b64c65328cb4cd252c94f83e66e3d7acf8891e60ebf588d7b493a55a1dbf26",
"sha256:d4e702eea4a2903441f2735799d217f4ac1b55f7d8ad96ab7d4e25417cb0827c",
"sha256:d5653619b3eb5cbd35bfba3c12d575db2a74d15e0e1c08bf1db788069d410ce8",
"sha256:d66624f04de4af8bbf1c7f21cc06649c1c69a7f84109179add573ce35e46d448",
"sha256:e67ec74fada3841b8c5f4c4f197bea916025cb9aa3fe5abf7d52b655d042f956",
"sha256:e6f7f3f41faffaea6596da86ecc2389672fa949bd035251eab26dc6697451d05",
"sha256:f02cf7221d5cd915d7fa58ab64f7ee6dd0f6cddbb48683debf5d04ae9b1c2cc1",
"sha256:f0eddfcabd6936558ec020130f932d479930581171368fd728efcfb6ef0dd357",
"sha256:fabbe18087c3d33c5824cb145ffca52eccd053061df1d79d4b66dafa5ad2a5ea",
"sha256:fc3150f85e2dbcf99e65238c842d1cfe69d3e7649b19864c1cc043213d9cd730"
],
"markers": "python_version >= '3.7'",
"version": "==2.1.0"
},
"pycparser": {
"hashes": [
"sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9",
"sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"
],
"version": "==2.21"
},
"pycryptodome": {
"hashes": [
"sha256:028dcbf62d128b4335b61c9fbb7dd8c376594db607ef36d5721ee659719935d5",
"sha256:12ef157eb1e01a157ca43eda275fa68f8db0dd2792bc4fe00479ab8f0e6ae075",
"sha256:2562de213960693b6d657098505fd4493c45f3429304da67efcbeb61f0edfe89",
"sha256:27e92c1293afcb8d2639baf7eb43f4baada86e4de0f1fb22312bfc989b95dae2",
"sha256:36e3242c4792e54ed906c53f5d840712793dc68b726ec6baefd8d978c5282d30",
"sha256:50a5346af703330944bea503106cd50c9c2212174cfcb9939db4deb5305a8367",
"sha256:53dedbd2a6a0b02924718b520a723e88bcf22e37076191eb9b91b79934fb2192",
"sha256:69f05aaa90c99ac2f2af72d8d7f185f729721ad7c4be89e9e3d0ab101b0ee875",
"sha256:75a3a364fee153e77ed889c957f6f94ec6d234b82e7195b117180dcc9fc16f96",
"sha256:766a8e9832128c70012e0c2b263049506cbf334fb21ff7224e2704102b6ef59e",
"sha256:7fb90a5000cc9c9ff34b4d99f7f039e9c3477700e309ff234eafca7b7471afc0",
"sha256:893f32210de74b9f8ac869ed66c97d04e7d351182d6d39ebd3b36d3db8bda65d",
"sha256:8b5c28058102e2974b9868d72ae5144128485d466ba8739abd674b77971454cc",
"sha256:924b6aad5386fb54f2645f22658cb0398b1f25bc1e714a6d1522c75d527deaa5",
"sha256:9924248d6920b59c260adcae3ee231cd5af404ac706ad30aa4cd87051bf09c50",
"sha256:9ec761a35dbac4a99dcbc5cd557e6e57432ddf3e17af8c3c86b44af9da0189c0",
"sha256:a36ab51674b014ba03da7f98b675fcb8eabd709a2d8e18219f784aba2db73b72",
"sha256:aae395f79fa549fb1f6e3dc85cf277f0351e15a22e6547250056c7f0c990d6a5",
"sha256:c880a98376939165b7dc504559f60abe234b99e294523a273847f9e7756f4132",
"sha256:ce7a875694cd6ccd8682017a7c06c6483600f151d8916f2b25cf7a439e600263",
"sha256:d1b7739b68a032ad14c5e51f7e4e1a5f92f3628bba024a2bda1f30c481fc85d8",
"sha256:dcd65355acba9a1d0fc9b923875da35ed50506e339b35436277703d7ace3e222",
"sha256:e04e40a7f8c1669195536a37979dd87da2c32dbdc73d6fe35f0077b0c17c803b",
"sha256:e0c04c41e9ade19fbc0eff6aacea40b831bfcb2c91c266137bcdfd0d7b2f33ba",
"sha256:e24d4ec4b029611359566c52f31af45c5aecde7ef90bf8f31620fd44c438efe7",
"sha256:e64738207a02a83590df35f59d708bf1e7ea0d6adce712a777be2967e5f7043c",
"sha256:ea56a35fd0d13121417d39a83f291017551fa2c62d6daa6b04af6ece7ed30d84",
"sha256:f2772af1c3ef8025c85335f8b828d0193fa1e43256621f613280e2c81bfad423",
"sha256:f403a3e297a59d94121cb3ee4b1cf41f844332940a62d71f9e4a009cc3533493",
"sha256:f572a3ff7b6029dd9b904d6be4e0ce9e309dcb847b03e3ac8698d9d23bb36525"
],
"index": "pypi",
"version": "==3.14.1"
},
"python-magic": {
"hashes": [
"sha256:1a2c81e8f395c744536369790bd75094665e9644110a6623bcc3bbea30f03973",
"sha256:21f5f542aa0330f5c8a64442528542f6215c8e18d2466b399b0d9d39356d83fc"
],
"index": "pypi",
"version": "==0.4.25"
},
"requests": {
"hashes": [
"sha256:68d7c56fd5a8999887728ef304a6d12edc7be74f1cfa47714fc8b414525c9a61",
"sha256:f22fa1e554c9ddfd16e6e41ac79759e17be9e492b3587efa038054674760e72d"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'",
"version": "==2.27.1"
},
"requests-file": {
"hashes": [
"sha256:07d74208d3389d01c38ab89ef403af0cfec63957d53a0081d8eca738d0247d8e",
"sha256:dfe5dae75c12481f68ba353183c53a65e6044c923e64c24b2209f6c7570ca953"
],
"version": "==1.5.1"
},
"six": {
"hashes": [
"sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926",
"sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.16.0"
},
"tldextract": {
"hashes": [
"sha256:d2034c3558651f7d8fdadea83fb681050b2d662dc67a00d950326dc902029444",
"sha256:f55e05f6bf4cc952a87d13594386d32ad2dd265630a8bdfc3df03bd60425c6b0"
],
"index": "pypi",
"version": "==3.1.2"
},
"urllib3": {
"hashes": [
"sha256:000ca7f471a233c2251c6c7023ee85305721bfdf18621ebff4fd17a8653427ed",
"sha256:0e7c33d9a63e7ddfcb86780aac87befc2fbddf46c58dbb487e0855f7ceec283c"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'",
"version": "==1.26.8"
},
"zipp": {
"hashes": [
"sha256:9f50f446828eb9d45b267433fd3e9da8d801f614129124863f9c51ebceafb87d",
"sha256:b47250dd24f92b7dd6a0a8fc5244da14608f3ca90a5efcd37a3b1642fac9a375"
],
"markers": "python_version >= '3.7'",
"version": "==3.7.0"
}
},
"develop": {}
}

README.md (new file, 7 lines)

@@ -0,0 +1,7 @@
# Barkshark Async HTTP
A simple HTTP client and server framework based on the built-in asyncio module.
# NOTE
Not in a stable state yet. Expect major changes.
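
None of the HTTP framework described above is in this commit yet, so the sketch below (not part of the commit) only illustrates the built-in asyncio primitives the README refers to; it is plain standard-library code, not the project's own API.

# Minimal asyncio TCP echo server using only the standard library.
# This is NOT the barkshark API; it just shows the asyncio building
# blocks the README mentions.
import asyncio

async def handle(reader, writer):
    data = await reader.readline()
    writer.write(data)           # echo the received line back
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 8080)
    async with server:
        await server.serve_forever()

if __name__ == '__main__':
    asyncio.run(main())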

barkshark_sql/__init__.py (new file, 30 lines)

@@ -0,0 +1,30 @@
from .database import Database, Connection
from .exceptions import *
from .result import Result
from .row import Row
from .session import Session
from .statements import Comparison, Statement, Select, Insert, Update, Delete, Count
from .table import Column
__all__ = [
    'Column',
    'Comparison',
    'Connection',
    'Count',
    'Database',
    'Delete',
    'Insert',
    'Result',
    'Row',
    'Select',
    'Session',
    'Statement',
    'Update',
    'NoConnectionError',
    'MaxConnectionsError',
    'NoTransactionError',
    'NoTableLayoutError',
    'UpdateAllRowsError'
]

barkshark_sql/config.py (new file, 103 lines)

@@ -0,0 +1,103 @@
import sqlite3

from getpass import getuser
from importlib import import_module

from izzylib import (
    BaseConfig,
    Path
)

from .result import Result
from .row import Row
from .session import Session


class Config(BaseConfig):
    def __init__(self, **kwargs):
        super().__init__(
            appname = 'IzzyLib SQL Client',
            type = 'sqlite',
            module = None,
            module_name = None,
            tables = {},
            row_classes = {},
            session_class = Session,
            result_class = Result,
            host = 'localhost',
            port = 0,
            database = None,
            username = getuser(),
            password = None,
            minconnections = 4,
            maxconnections = 25,
            engine_args = {},
            auto_trans = True,
            connect_function = None,
            autocommit = False
        )

        for k, v in kwargs.items():
            self[k] = v

        if not self.database:
            if self.type == 'sqlite':
                self.database = ':memory:'

            else:
                raise ValueError('Missing database name')

        if not self.port:
            if self.type == 'postgresql':
                self.port = 5432

            elif self.type == 'mysql':
                self.port = 3306

        if not self.module and not self.connect_function:
            if self.type == 'sqlite':
                self.module = sqlite3
                self.module_name = 'sqlite3'

            elif self.type == 'postgresql':
                for mod in ['pg8000.dbapi', 'pgdb', 'psycopg2']:
                    try:
                        self.module = import_module(mod)
                        self.module_name = mod
                        break

                    except ImportError:
                        pass

            elif self.type == 'mysql':
                try:
                    self.module = import_module('mysql.connector')
                    self.module_name = 'mysql.connector'

                except ImportError:
                    pass

            if not self.module:
                raise ImportError(f'Cannot find module for "{self.type}"')

        self.module.paramstyle = 'qmark'


    @property
    def dbargs(self):
        return {key: self[key] for key in ['host', 'port', 'database', 'username', 'password']}


    def parse_value(self, key, value):
        if key == 'type':
            if value not in ['sqlite', 'postgresql', 'mysql', 'mssql']:
                raise ValueError(f'Invalid database type: {value}')

        if key == 'port':
            if not isinstance(value, int):
                raise TypeError('Port is not an integer')

        if key == 'row_classes':
            for row_class in value.values():
                if not issubclass(row_class, Row):
                    raise TypeError(f'Row classes must be izzylib.sql2.row.Row, not {row_class.__name__}')

        return value
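
Usage sketch (not part of this commit) showing how Config fills in defaults for the SQLite backend. It assumes izzylib and this package are importable; the import path barkshark_sql.config simply follows the file layout above.

# Sketch: Config defaults when only the database type is given.
from barkshark_sql.config import Config

cfg = Config(type='sqlite')

print(cfg.database)     # ':memory:' is filled in when no database name is given
print(cfg.module_name)  # 'sqlite3' is selected automatically for the sqlite type
print(cfg.dbargs)       # host/port/database/username/password as a plain dict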

barkshark_sql/database.py (new file, 256 lines)

@@ -0,0 +1,256 @@
import itertools

from izzylib import (
    DotDict,
    Path,
    izzylog
)

from .config import Config
from .row import Row
from .table import DbTables
from .types import Types

from .exceptions import (
    MaxConnectionsError,
    NoTableLayoutError,
    NoConnectionError
)


class Database:
    def __init__(self, autoconnect=True, app=None, **kwargs):
        self.cfg = Config(**kwargs)
        self.tables = DbTables(self)
        self.cache = None
        self.types = Types(self)
        self.connections = []
        self.app = app

        if self.cfg.tables:
            self.load_tables(self.cfg.tables)

        if autoconnect:
            self.connect()


    def connect(self):
        for _ in itertools.repeat(None, self.cfg.minconnections):
            self.get_connection()


    def disconnect(self):
        for conn in self.connections:
            conn.disconnect()

        self.connections = []


    @property
    def session(self):
        return self.get_connection().session


    def new_connection(self):
        if len(self.connections) >= self.cfg.maxconnections:
            raise MaxConnectionsError('Too many connections')

        conn = Connection(self)
        conn.connect()

        self.connections.append(conn)
        return conn


    def close_connection(self, conn):
        conn.close_sessions()
        conn.disconnect()

        if not conn.conn:
            try: self.connections.remove(conn)
            except ValueError: pass


    def get_connection(self):
        if not len(self.connections):
            return self.new_connection()

        if len(self.connections) < self.cfg.minconnections:
            return self.new_connection()

        for conn in self.connections:
            if not len(conn.sessions):
                return conn

        if len(self.connections) < self.cfg.maxconnections:
            return self.new_connection()

        conns = {(conn, len(conn.sessions)) for conn in self.connections}
        return min(conns, key=lambda x: x[1])[0]


    def new_predb(self, database='postgres'):
        dbconfig = Config(**self.cfg)
        dbconfig['database'] = database
        dbconfig['autocommit'] = True

        return Database(**dbconfig)


    def set_row_class(self, name, row_class):
        if not issubclass(row_class, Row):
            raise TypeError(f'Row classes must be izzylib.sql2.row.Row, not {row_class.__name__}')

        self.cfg.row_classes[name] = row_class


    def get_row_class(self, name):
        return self.cfg.row_classes.get(name, Row)


    def load_tables(self, tables=None):
        if tables:
            self.tables.load_tables(tables)
            return self.tables

        with self.session as s:
            self.tables.load_tables(s.table_layout())

        return self.tables


    def create_tables(self, tables):
        if not tables:
            raise ValueError('No table layout provided')

        if self.tables.empty:
            self.load_tables(tables)

        with self.session as s:
            for table in self.tables.names:
                s.execute(self.tables.compile_table(table, self.cfg.type))


    def create_database(self, tables=None):
        if self.cfg.type == 'postgresql':
            with self.new_predb().session as s:
                if not s.raw_execute('SELECT datname FROM pg_database WHERE datname = ?', [self.cfg.database]).fetchone():
                    s.raw_execute(f'CREATE DATABASE {self.cfg.database}')

        elif self.cfg.type != 'sqlite':
            raise NotImplementedError(f'Database type not supported yet: {self.cfg.type}')

        self.create_tables(tables or self.cfg.tables)


    def drop_database(self, database):
        if self.cfg.type == 'sqlite':
            izzylog.verbose('drop_database not needed for SQLite')
            return

        with self.session as s:
            if self.cfg.type == 'postgresql':
                s.raw_execute(f'DROP DATABASE {database}')

            else:
                raise NotImplementedError(f'Database type not supported yet: {self.cfg.type}')


class Connection:
    def __init__(self, db):
        self.db = db
        self.cfg = db.cfg
        self.sessions = []
        self.conn = None

        self.connect()

        if db.tables.empty:
            with self.session as s:
                db.load_tables(s.table_layout())


    @property
    def autocommit(self):
        return self.conn.autocommit


    @property
    def session(self):
        return self.cfg.session_class(self)


    def connect(self):
        if self.conn:
            return

        dbconfig = self.cfg.dbargs

        if self.cfg.type == 'sqlite':
            if self.cfg.autocommit:
                self.conn = self.cfg.module.connect(dbconfig['database'], isolation_level=None, **self.cfg.engine_args)

            else:
                self.conn = self.cfg.module.connect(dbconfig['database'], **self.cfg.engine_args)

        elif self.cfg.type == 'postgresql':
            if Path(self.cfg.host).exists():
                dbconfig['unix_sock'] = dbconfig.pop('host')

            dbconfig['user'] = dbconfig.pop('username')
            dbconfig['application_name'] = self.cfg.appname

            self.conn = self.cfg.module.connect(**dbconfig, **self.cfg.engine_args)

        else:
            self.conn = self.cfg.module.connect(**self.cfg.dbargs, **self.cfg.engine_args)

        try:
            self.conn.autocommit = self.cfg.autocommit

        except AttributeError:
            if self.cfg.module_name not in ['sqlite3']:
                izzylog.verbose('Module does not support autocommit:', self.cfg.module_name)

        return self.conn


    def disconnect(self):
        if not self.conn:
            return

        self.close_sessions()
        self.conn.close()
        self.conn = None


    def close_sessions(self):
        for session in self.sessions:
            self.close_session(session)


    def close_session(self, session):
        try: self.sessions.remove(session)
        except ValueError: pass

        session.close()

        if not len(self.sessions) and len(self.db.connections) > self.cfg.minconnections:
            self.disconnect()


    def cursor(self):
        if not self.conn:
            raise NoConnectionError('Not connected to the database')

        return self.conn.cursor()


    def dump_database(self, path='database.sql'):
        if self.cfg.type == 'sqlite':
            path = Path(path)

            with path.open('w') as fd:
                fd.write('\n\n'.join(list(self.conn.iterdump())[1:-1]))

        else:
            raise NotImplementedError('Only SQLite supported atm :/')
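
Usage sketch (not part of this commit) of the connection pool and transactional sessions, assuming izzylib and the unshown modules referenced above (.table, .types, .statements) behave as these files expect. The 'note' table and the SQL strings are only illustrative.

# Sketch: in-memory SQLite database, pooled connection, one transaction.
from barkshark_sql import Database

db = Database(type='sqlite', autoconnect=True)   # opens cfg.minconnections connections

with db.session as s:                            # Session taken from a pooled Connection
    s.execute('CREATE TABLE note (id INTEGER PRIMARY KEY, text TEXT)')
    s.execute('INSERT INTO note (text) VALUES (?)', 'hello')
    # leaving the "with" block commits; an exception would roll back

with db.session as s:
    for row in s.execute('SELECT * FROM note'):
        print(row.id, row.text)                  # rows are DotDict-style Row objects

db.disconnect()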

barkshark_sql/exceptions.py (new file, 27 lines)

@@ -0,0 +1,27 @@
__all__ = [
    'NoConnectionError',
    'MaxConnectionsError',
    'NoTransactionError',
    'NoTableLayoutError',
    'UpdateAllRowsError'
]


class NoConnectionError(Exception):
    'Raise when a function requiring a connection gets called when there is no connection'


class MaxConnectionsError(Exception):
    'Raise when the max amount of connections has been reached'


class NoTransactionError(Exception):
    'Raise when trying to execute an SQL write statement outside a transaction'


class NoTableLayoutError(Exception):
    'Raise when a table layout is necessary, but not loaded'


class UpdateAllRowsError(Exception):
    'Raise when an UPDATE tries to modify all rows in a table'
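
Sketch (not part of this commit) of where two of these exceptions surface: MaxConnectionsError comes from Database.new_connection() above, UpdateAllRowsError from the filter guard in Session.update() further down. The 'note' table is illustrative; izzylib and the rest of the package are assumed importable.

# Sketch: triggering MaxConnectionsError and UpdateAllRowsError on purpose.
from barkshark_sql import Database, MaxConnectionsError, UpdateAllRowsError

db = Database(type='sqlite', minconnections=1, maxconnections=1)

try:
    db.new_connection()              # pool already holds maxconnections connections
except MaxConnectionsError:
    print('pool is full')

with db.session as s:
    try:
        s.update('note', {'text': 'oops'})   # no id=... filter supplied
    except UpdateAllRowsError:
        print('refused to touch every row')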

barkshark_sql/result.py (new file, 68 lines)

@@ -0,0 +1,68 @@
from .row import Row


class Result:
    def __init__(self, session):
        self.table = None
        self.session = session
        self.cursor = session.cursor

        try:
            self.keys = [desc[0] for desc in session.cursor.description]

        except TypeError:
            self.keys = []


    def __iter__(self):
        yield from self.all_iter()


    @property
    def row_class(self):
        return self.session.db.get_row_class(self.table)


    @property
    def last_row_id(self):
        if self.session.cfg.type == 'postgresql':
            try:
                return self.one().id

            except Exception:
                return None

        return self.cursor.lastrowid


    @property
    def row_count(self):
        return self.cursor.rowcount


    def set_table(self, table):
        self.table = table


    def one(self):
        data = self.cursor.fetchone()

        if not data:
            return

        return self.row_class(
            self.session,
            self.table,
            {self.keys[idx]: value for idx, value in enumerate(data)},
        )


    def all(self):
        return [row for row in self.all_iter()]


    def all_iter(self):
        for row in self.cursor:
            yield self.row_class(self.session, self.table,
                {self.keys[idx]: value for idx, value in enumerate(row)}
            )
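
Sketch (not part of this commit) of the two ways a Result is consumed: one() for a single Row, iteration for the rest of the cursor. It assumes izzylib and the package are importable; the 'note' table and its columns are illustrative.

# Sketch: one() vs. iterating a Result.
from barkshark_sql import Database

db = Database(type='sqlite')

with db.session as s:
    s.execute('CREATE TABLE note (id INTEGER PRIMARY KEY, text TEXT)')
    s.execute('INSERT INTO note (text) VALUES (?)', 'first')
    s.execute('INSERT INTO note (text) VALUES (?)', 'second')

with db.session as s:
    result = s.execute('SELECT id, text FROM note')
    print(result.one().text)        # a single Row ('first'), or None if nothing matched

    for row in result:              # __iter__ streams the remaining Rows via all_iter()
        print(row.id, row.text)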

barkshark_sql/row.py (new file, 34 lines)

@@ -0,0 +1,34 @@
from izzylib import DotDict


class Row(DotDict):
    def __init__(self, session, table, data):
        super().__init__(session._parse_data('serialize', table, data))

        self._table = table
        self._session = session

        self.__run__(session)


    def __run__(self, session):
        pass


    @property
    def session(self):
        return self._session


    @property
    def table(self):
        return self._table


    @property
    def rowid(self):
        return self.id


    @property
    def rowid2(self):
        return self.get('rowid', self.id)
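
Sketch (not part of this commit) of attaching a custom Row subclass to a table name via Database.set_row_class(), so fetched rows gain helper methods. The 'user' table and the nick/username fields are illustrative only.

# Sketch: registering a Row subclass for one table.
from barkshark_sql import Database, Row

class UserRow(Row):
    def display_name(self):
        return self.get('nick') or self.username   # illustrative columns

db = Database(type='sqlite')
db.set_row_class('user', UserRow)                  # validated as a Row subclass

# equivalently at construction time:
# db = Database(type='sqlite', row_classes={'user': UserRow})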

barkshark_sql/session.py (new file, 416 lines)

@@ -0,0 +1,416 @@
import json
from pathlib import Path as PyPath
from .result import Result
from .row import Row
from izzylib import (
DotDict,
Path,
boolean,
izzylog,
random_gen
)
from .exceptions import (
NoTransactionError,
UpdateAllRowsError
)
from .statements import (
Count,
Delete,
Insert,
Select,
Statement,
Update
)
from .table import (
SessionTables,
DbColumn
)
class Session:
def __init__(self, conn):
self.db = conn.db
self.cfg = conn.db.cfg
self.conn = conn
self.sid = random_gen()
self.tables = SessionTables(self)
self.cursor = conn.cursor()
self.trans = False
self.__setup__()
def __enter__(self):
return self
def __exit__(self, exctype, excvalue, traceback):
if traceback:
self.rollback()
else:
self.commit()
def __setup__(self):
pass
@property
def cache(self):
return self.db.cache
def close():
if not self.cursor:
return
self.conn.close_session(self)
def _parse_data(self, action, table, kwargs):
data = {}
if self.db.tables:
for key, value in kwargs.items():
try:
coltype = self.db.tables[table][key].type
except KeyError:
data[key] = value
continue
parser = self.db.types.get_type(coltype)
try:
data[key] = parser(action, self.cfg.type, value)
except Exception as e:
izzylog.error(f'Failed to parse data from the table "{table}": {key} = {value}')
izzylog.debug(f'Parser: {parser}, Type: {coltype}')
raise e from None
else:
data = kwargs
return data
def dump_database(self, path):
import sqlparse
path = Path(path)
with path.open('w') as fd:
line = '\n\n'.join(list(self.conn.iterdump())[1:-1])
fd.write(sqlparse.format(line,
reindent = False,
keyword_case = 'upper',
))
def dump_database2(self, path):
path = Path(path)
with path.open('w') as fd:
fd.write('\n\n'.join(list(self.conn.iterdump())[1:-1]))
def begin(self):
if self.trans or self.cfg.autocommit:
return
self.execute('BEGIN')
self.trans = True
def commit(self):
if not self.trans:
return
self.execute('COMMIT')
self.trans = False
def rollback(self):
if not self.trans:
return
self.execute('ROLLBACK')
self.trans = False
def raw_execute(self, string, values=None):
if type(string) == Path:
string = string.read()
elif type(string) == PyPath:
with string.open() as fd:
string = fd.read()
if values:
self.cursor.execute(string, values)
else:
self.cursor.execute(string)
return self.cursor
def execute(self, string, *values):
if isinstance(string, Statement):
raise TypeError('String must be a str not a Statement')
action = string.split()[0].upper()
if not self.trans and action in ['CREATE', 'INSERT', 'UPDATE', 'UPSERT', 'DROP', 'DELETE', 'ALTER']:
if self.cfg.auto_trans:
self.begin()
else:
raise NoTransactionError(f'Command not supported outside a transaction: {action}')
try:
self.raw_execute(string, values)
except Exception as e:
if type(e).__name__ in ['DatabaseError', 'OperationalError']:
print(string, values)
raise e from None
return Result(self)
def run(self, query):
result = self.execute(query.compile(self.cfg.type), *query.values)
if type(query) == Count:
return list(result.one().values())[0]
result.set_table(query.table)
return result
def run_count(self, query):
return list(self.run(query).one().values())[0]
def count(self, table, **kwargs):
if self.db.tables and table not in self.db.tables:
raise KeyError(f'Table does not exist: {table}')
query = Count(table, **kwargs)
return self.run_count(query)
def fetch(self, table, orderby=None, orderdir='ASC', limit=None, offset=None, **kwargs):
if self.db.tables and table not in self.db.tables:
raise KeyError(f'Table does not exist: {table}')
query = Select(table, **kwargs)
if orderby:
query.order(orderby, orderdir)
if limit:
			query.limit(limit)
if offset:
query.offset(offset)
return self.run(query)
def insert(self, table, return_row=False, **kwargs):
if self.db.tables and table not in self.db.tables:
raise KeyError(f'Table does not exist: {table}')
result = self.run(Insert(table, **self._parse_data('deserialize', table, kwargs)))
if return_row:
return self.fetch(table, id=result.last_row_id).one()
return result.last_row_id
def update(self, table, data, return_row=False, **kwargs):
query = Update(table, **data)
for pair in kwargs.items():
query.where(*pair)
if not query._where:
raise UpdateAllRowsError(f'Refusing to update all rows in table: {table}')
result = self.run(query)
if return_row:
return self.fetch(table, id=result.last_row_id).one()
else:
return result
def update_row(self, row, return_row=False, **kwargs):
return self.update(row.table, kwargs, id=row.id, return_row=return_row)
def remove(self, table, **kwargs):
if self.db.tables and table not in self.db.tables:
raise KeyError(f'Table does not exist: {table}')
		self.run(Delete(table, **self._parse_data('deserialize', table, kwargs)))
def remove_row(self, row):
if not row.table:
raise ValueError('Row not associated with a table')
self.remove(row.table, id=row.id)
def create_table(self, table):
		self.execute(self.db.tables.compile_table(table, self.cfg.type))
def drop_table(self, table):
self.execute(f'DROP TABLE {table}')
def alter_table(self, table):
return AlterTable(self, table)
	def create_tables(self, tables=None):
		if tables:
			self.db.tables.load_tables(tables)
		if self.db.tables.empty:
			raise NoTableLayoutError('No table layout available')
		for table in self.db.tables.names:
			self.create_table(table)
def table_layout(self):
tables = {}
if self.cfg.type == 'sqlite':
rows = self.execute("SELECT name, sql FROM sqlite_master WHERE type = 'table' AND name NOT LIKE 'sqlite_%'")
for row in rows:
name = row.name
tables[name] = {}
fkeys = {fkey['from']: f'{fkey.table}.{fkey["to"]}' for fkey in self.execute(f'PRAGMA foreign_key_list({name})')}
columns = [col for col in self.execute(f'PRAGMA table_info({name})')]
unique_list = parse_unique(row.sql)
for column in columns:
tables[name][column.name] = dict(
type = column.type.upper(),
nullable = not column.notnull,
default = parse_default(column.dflt_value),
primary_key = bool(column.pk),
foreign_key = fkeys.get(column.name),
unique = column.name in unique_list
)
elif self.cfg.type == 'postgresql':
for row in self.execute("SELECT * FROM information_schema.columns WHERE table_schema not in ('information_schema', 'pg_catalog') ORDER BY table_schema, table_name, ordinal_position"):
table = row.table_name
column = row.column_name
if not tables.get(table):
tables[table] = {}
if not tables[table].get(column):
tables[table][column] = {}
tables[table][column] = dict(
type = row.data_type.upper(),
nullable = boolean(row.is_nullable),
default = row.column_default if row.column_default and not row.column_default.startswith('nextval') else None,
primary_key = None,
foreign_key = None,
unique = None
)
return tables
class AlterTable:
def __init__(self, session, table):
self.session = session
self.table = table
def _execute(self, sql):
return self.session.execute(f"ALTER TABLE '{self.table}' {sql}")
def rename_table(self, new_name):
return self._execute(f"RENAME TO '{new_name}'")
def rename_column(self, column, new_name):
return self._execute(f"RENAME COLUMN '{column}' TO '{new_name}'")
def add_column(self, *args, dbtype='sqlite', **kwargs):
col = DbColumn(dbtype, *args, **kwargs)
		if not col.nullable and col.default is None:
			raise ValueError('A default value must be provided for NOT NULL columns')
		if col.foreign_key and col.default is not None:
			raise ValueError('A default value of "None" must be used if a foreign key is set')
return self._execute(f"ADD COLUMN {col.compile()}")
def drop_column(self, column):
return self._execute(f"DROP TABLE '{column}'")
def parse_unique(sql):
unique_list = []
try:
for raw_line in sql.splitlines():
if 'UNIQUE' not in raw_line:
continue
for line in raw_line.replace('UNIQUE', '').replace('(', '').replace(')', '').split(','):
line = line.strip()
if line:
unique_list.append(line)
except IndexError:
pass
return unique_list
def parse_default(value):
if value == None:
return
if value.startswith("'") and value.endswith("'"):
value = value[1:-1]
else:
try:
value = int(value)
except ValueError:
pass
return value
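A brief sketch of typical Session usage against a hypothetical `user` table; `db.session()` is a placeholder for however the Database/connection side hands out sessions, which lives outside this file. `__exit__` commits on a clean exit and rolls back if an exception escaped the block:

# hypothetical usage -- `db.session()` and the `user` table are assumptions
with db.session() as s:
	s.begin()
	user_id = s.insert('user', username='izzy')
	row = s.fetch('user', id=user_id).one()
	s.update_row(row, username='zoey')
	total = s.count('user', username='zoey')
# leaving the block commits (or rolls back on error)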

View file

@ -0,0 +1,11 @@
import asyncio
from .database import Database
class client:
pass
class Server:
pass

216
barkshark_sql/statements.py Normal file
View file

@ -0,0 +1,216 @@
from izzylib import DotDict
from .table import DbColumn
Comparison = DotDict(
LESS = lambda key: f'{key} < ?',
GREATER = lambda key: f'{key} > ?',
LESS_EQUAL = lambda key: f'{key} <= ?',
GREATER_EQUAL = lambda key: f'{key} >= ?',
EQUAL = lambda key: f'{key} = ?',
NOT_EQUAL = lambda key: f'{key} != ?',
IN = lambda key: f'{key} IN (?)',
NOT_IN = lambda key: f'{key} NOT IN (?)',
LIKE = lambda key: f'{key} LIKE ?',
NOT_LIKE = lambda key: f'{key} NOT LIKE ?'
)
class Statement:
def __init__(self, table):
self.table = table
self.values = []
self._where = ''
self._order = None
self._limit = None
self._offset = None
def __str__(self):
return self.compile('sqlite')
def where(self, key, value, comparison='equal', operator='and'):
try:
comp = Comparison[comparison.upper().replace('-', '_')]
except KeyError:
raise KeyError(f'Invalid comparison: {comparison}')
prefix = f' {operator} ' if self._where else ' '
self._where += f'{prefix}{comp(key)}'
self.values.append(value)
return self
def order(self, column, direction='ASC'):
direction = direction.upper()
assert direction in ['ASC', 'DESC']
self._order = (column, direction)
return self
def limit(self, limit_num):
self._limit = int(limit_num)
return self
def offset(self, offset_num):
self._offset = int(offset_num)
return self
def compile(self, dbtype):
raise NotImplementedError('Do not use the Statement class directly.')
class Select(Statement):
def __init__(self, table, *columns, **kwargs):
super().__init__(table)
self.columns = columns
for key, value in kwargs.items():
self.where(key, value)
def compile(self, dbtype):
data = f'SELECT'
if self.columns:
columns = ','.join(self.columns)
else:
columns = '*'
data += f' {columns} FROM {self.table}'
if self._where:
data += f' WHERE {self._where}'
if self._order:
col, direc = self._order
data += f' ORDER BY "{col}" {direc}'
if self._limit:
data += f' LIMIT {self._limit}'
if self._offset:
data += f' OFFSET {self._offset}'
return data
class Insert(Statement):
def __init__(self, table, **kwargs):
super().__init__(table)
self.keys = []
for pair in kwargs.items():
self.add_data(*pair)
def add_data(self, key, value):
self.keys.append(key)
self.values.append(value)
def remove_data(self, key):
index = self.keys.index(key)
del self.keys[index]
del self.values[index]
def compile(self, dbtype):
keys = ','.join([f'"{key}"' for key in self.keys])
values = ','.join('?' for value in self.values)
data = f'INSERT INTO {self.table} ({keys}) VALUES ({values})'
if dbtype == 'postgresql':
data += f' RETURNING id'
return data
class Update(Statement):
def __init__(self, table, **kwargs):
super().__init__(table)
self.keys = []
for key, value in kwargs.items():
self.keys.append(key)
self.values.append(value)
def compile(self, dbtype):
pairs = ','.join(f'"{key}" = ?' for key in self.keys)
data = f'UPDATE {self.table} SET {pairs} WHERE {self._where}'
if dbtype == 'postgresql':
data += f' RETURNING id'
return data
class Delete(Statement):
def __init__(self, table, **kwargs):
super().__init__(table)
for key, value in kwargs.items():
self.where(key, value)
def compile(self, dbtype):
return f'DELETE FROM {self.table} WHERE {self._where}'
class Count(Statement):
def __init__(self, table, **kwargs):
super().__init__(table)
for key, value in kwargs.items():
self.where(key, value)
def compile(self, dbtype):
data = f'SELECT COUNT(*) FROM {self.table}'
if self._where:
data += f' WHERE {self._where}'
return data
class AlterTable(Statement):
def compile(self, sql):
return f"ALTER TABLE '{self.table}' {sql}"
def rename(self, new_name):
return self.compile(f"RENAME TO '{new_name}'")
	def rename_column(self, column, new_name):
return self.compile(f"RENAME COLUMN '{column}' TO '{new_name}'")
	def add_column(self, *args, dbtype='sqlite', **kwargs):
col = DbColumn(dbtype, *args, **kwargs)
		if not col.nullable and col.default is None:
			raise ValueError('A default value must be provided for NOT NULL columns')
		if col.foreign_key and col.default is not None:
			raise ValueError('A default value of "None" must be used if a foreign key is set')
return self.compile(f"ADD COLUMN {col.compile()}")
	def drop_column(self, column):
		return self.compile(f"DROP COLUMN '{column}'")

290
barkshark_sql/table.py Normal file
View file

@ -0,0 +1,290 @@
import json
from izzylib import (
	DotDict,
	LruCache
)
class SessionTables:
def __init__(self, session):
self._session = session
self._db = session.db
self._tables = session.db.tables
def __getattr__(self, key):
		return SessionTable(self._session, key, self._tables[key])
def names(self):
return tuple(self._tables.keys())
class SessionTable(DotDict):
def __init__(self, session, name, columns):
super().__init__(columns)
self._name = name
self._session = session
self._db = session.db
@property
def name(self):
return self._name
@property
def columns(self):
return tuple(self.keys())
	def fetch(self, **kwargs):
		self._check_columns(**kwargs)
		return self._session.fetch(self.name, **kwargs)
	def insert(self, **kwargs):
		self._check_columns(**kwargs)
		return self._session.insert(self.name, **kwargs)
	def remove(self, **kwargs):
		self._check_columns(**kwargs)
		return self._session.remove(self.name, **kwargs)
def _check_columns(self, **kwargs):
for key in kwargs.keys():
if key not in self.columns:
raise KeyError(f'Not a column for table "{self.name}": {key}')
class DbTables(DotDict):
def __init__(self, db):
super().__init__()
self._db = db
self._cfg = db.cfg
@property
def empty(self):
return not len(self.keys())
@property
def names(self):
return tuple(self.keys())
def load_tables(self, tables):
for name, columns in tables.items():
self.add_table(name, columns)
def unload_tables(self):
for key in self.names:
			self.remove_table(key)
self._db.cache = None
def add_table(self, name, columns):
self[name] = {}
if type(columns) == list:
columns = {col.name: col for col in columns}
for column, data in columns.items():
self.add_column(name, column, data)
if not self._db.cache:
self._db.cache = DotDict({name: LruCache()})
else:
self._db.cache[name] = LruCache()
def remove_table(self, name):
del self._db.cache[name]
return self.pop(name)
def get_columns(self, name):
return tuple(self[name].values())
def add_column(self, table, name, data):
if type(data) == Column:
data = DbColumn.new_from_column(self._cfg.type, data)
elif type(data) == dict:
data = DbColumn(self._cfg.type, name, **data)
elif type(data) == DbColumn:
pass
else:
raise TypeError(f'Invalid column data type: {type(data).__name__}')
if not self.get(table):
self[table] = {}
self[table][name] = data
def compile_table(self, table_name, dbtype):
table = self[table_name]
columns = []
foreign_keys = []
for column in self.get_columns(table_name):
columns.append(column.compile(dbtype))
if column.foreign_key:
fkey_table, fkey_col = column.foreign_key.split('.', 1)
foreign_keys.append(f"FOREIGN KEY ('{column.name}') REFERENCES '{fkey_table}' ('{fkey_col}')")
if foreign_keys:
return f"CREATE TABLE IF NOT EXISTS '{table_name}' ({','.join(columns)}, {','.join(foreign_keys)})"
else:
return f"CREATE TABLE IF NOT EXISTS '{table_name}' ({','.join(columns)})"
def compile_all(self, dbtype):
return [self.compile_table(name, dbtype) for name in self.keys()]
class DbColumn(DotDict):
def __init__(self, dbtype, name, type=None, default=None, primary_key=False, unique=False, nullable=True, autoincrement=False, foreign_key=None):
super().__init__(
name = name,
type = type,
default = default,
primary_key = primary_key,
unique = unique,
nullable = nullable,
autoincrement = autoincrement,
foreign_key = foreign_key
)
self._dbtype = dbtype
if self.name == 'id':
if dbtype == 'sqlite':
self.type = 'INTEGER'
self.autoincrement = True
elif dbtype == 'postgresql':
self.type = 'SERIAL'
self.autoincrement = False
self.primary_key = True
self.unique = False
self.nullable = True
self.default = None
self.foreign_key = None
elif self.name in ['created', 'modified', 'accessed'] and not self.type:
self.type = 'DATETIME'
if not self.type:
raise ValueError(f'Must provide a column type for column: {name}')
try:
self.fkey
except ValueError:
raise ValueError(f'Invalid foreign_key format. Must be "table.column"')
@classmethod
def new_from_column(cls, dbtype, column):
return cls(dbtype, column.name,
type=column.type,
default=column.default,
primary_key=column.primary_key,
unique=column.unique,
nullable=column.nullable,
autoincrement=column.autoincrement,
foreign_key=column.foreign_key
)
@property
def fkey(self):
try:
return self.foreign_key.split('.')
except AttributeError:
return
def compile(self, *args):
line = f"'{self.name}' {self.type}"
if self.primary_key:
line += ' PRIMARY KEY'
if not self.nullable:
line += ' NOT NULL'
if self.unique:
line += ' UNIQUE'
if self.autoincrement and self._dbtype != 'postgresql':
line += ' AUTOINCREMENT'
if self.default:
line += f" DEFAULT {parse_default(self.default)}"
return line
class Column(DotDict):
def __init__(self, name, type=None, default=None, primary_key=False, unique=False, nullable=True, autoincrement=False, foreign_key=None):
super().__init__(
name = name,
type = type.upper() if type else None,
default = default,
primary_key = primary_key,
unique = unique,
nullable = nullable,
autoincrement = autoincrement,
foreign_key = foreign_key
)
if self.name == 'id':
self.type = 'SERIAL'
elif self.name in ['created', 'modified', 'accessed'] and not self.type:
self.type = 'DATETIME'
if not self.type:
raise ValueError(f'Must provide a column type for column: {name}')
try:
self.fkey
except ValueError:
raise ValueError(f'Invalid foreign_key format. Must be "table.column"')
@property
def fkey(self):
try:
return self.foreign_key.split('.')
except AttributeError:
return
def parse_default(default):
if isinstance(default, dict) or isinstance(default, list):
default = json.dumps(default)
if type(default) == str:
default = f"'{default}'"
return default
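A minimal sketch of declaring a layout with Column objects and loading it into DbTables; the `db` object standing in for a Database instance (with its tables at `db.tables`) and the table/column names are assumptions:

# hypothetical layout
layout = {
	'user': [
		Column('id'),                               # id columns get the SERIAL / INTEGER PRIMARY KEY treatment
		Column('username', 'text', nullable=False, unique=True),
		Column('created'),                          # created/modified/accessed default to DATETIME
		Column('group_id', 'integer', foreign_key='group.id')
	]
}

db.tables.load_tables(layout)
print(db.tables.compile_table('user', 'sqlite'))    # CREATE TABLE IF NOT EXISTS 'user' (...)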

222
barkshark_sql/types.py Normal file
View file

@ -0,0 +1,222 @@
from datetime import date, time, datetime
from izzylib import (
DotDict,
LowerDotDict,
izzylog
)
Standard = {
'INTEGER',
'INT',
'TINYINT',
'SMALLINT',
'MEDIUMINT',
'BIGINT',
'UNSIGNED BIG INT',
'INT2',
'INT8',
'TEXT',
'CHARACTER',
'CHAR',
'VARCHAR',
'BLOB',
'CLOB',
'REAL',
'DOUBLE',
'DOUBLE PRECISION',
'FLOAT',
'NUMERIC',
'DEC',
'DECIMAL',
'BOOLEAN',
'DATE',
'TIME',
'JSON'
}
Sqlite = {
*Standard,
'DATETIME'
}
Postgresql = {
*Standard,
'SMALLSERIAL',
'SERIAL',
'BIGSERIAL',
'VARYING',
'BYTEA',
'TIMESTAMP',
'INTERVAL',
'POINT',
'LINE',
'LSEG',
'BOX',
'PATH',
'POLYGON',
'CIRCLE',
}
Mysql = {
*Standard,
'FIXED',
'BIT',
'YEAR',
'VARBINARY',
'ENUM',
'SET'
}
class Type:
sqlite = None
postgresql = None
mysql = None
def __getitem__(self, key):
if key in ['sqlite', 'postgresql', 'mysql']:
return getattr(self, key)
raise KeyError(f'Invalid database type: {key}')
def __call__(self, action, dbtype, value):
return getattr(self, action)(dbtype, value)
def name(self, dbtype='sqlite'):
return self[dbtype]
def serialize(self, dbtype, value):
return value
def deserialize(self, dbtype, value):
return value
class Json(Type):
sqlite = 'JSON'
postgresql = 'JSON'
mysql = 'JSON'
def serialize(self, dbtype, value):
izzylog.debug(f'serialize {type(self).__name__}: {type(value).__name__}', value)
if type(value) == str:
return DotDict(value)
return value
def deserialize(self, dbtype, value):
izzylog.debug(f'deserialize {type(self).__name__}: {type(value).__name__}', value)
return DotDict(value).to_json()
class Datetime(Type):
sqlite = 'DATETIME'
postgresql = 'TIMESTAMP'
mysql = 'DATETIME'
def serialize(self, dbtype, value):
izzylog.debug(f'serialize {type(self).__name__}: {type(value).__name__}', value)
if type(value) == str:
return datetime.fromisoformat(value)
elif type(value) == int:
return datetime.fromtimestamp(value)
return value
def deserialize(self, dbtype, value):
izzylog.debug(f'deserialize {type(self).__name__}: {type(value).__name__}', value)
if dbtype == 'sqlite':
return value.isoformat()
return value
class Date(Type):
sqlite = 'DATE'
postgresql = 'DATE'
mysql = 'DATE'
def serialize(self, dbtype, value):
izzylog.debug(f'serialize {type(self).__name__}: {type(value).__name__}', value)
if type(value) == str:
return date.fromisoformat(value)
elif type(value) == int:
return date.fromtimestamp(value)
return value
def deserialize(self, dbtype, value):
izzylog.debug(f'deserialize {type(self).__name__}: {type(value).__name__}', value)
if dbtype == 'sqlite':
return value.isoformat()
return value
class Time(Type):
sqlite = 'TIME'
postgresql = 'TIME'
mysql = 'TIME'
def serialize(self, dbtype, value):
izzylog.debug(f'serialize {type(self).__name__}: {type(value).__name__}', value)
if type(value) == str:
return time.fromisoformat(value)
elif type(value) == int:
			return datetime.fromtimestamp(value).time()
return value
def deserialize(self, dbtype, value):
izzylog.debug(f'deserialize {type(self).__name__}: {type(value).__name__}', value)
if dbtype == 'sqlite':
return value.isoformat()
return value
class Types(DotDict):
def __init__(self, db):
self._db = db
self.set_type(Json, Date, Time, Datetime)
def get_type(self, name):
return self.get(name.upper(), Type())
def set_type(self, *types):
for type_object in types:
typeclass = type_object()
self[typeclass.name(self._db.cfg.type)] = typeclass
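A brief sketch of registering a custom column type; a Type subclass names itself per database and overrides serialize()/deserialize(), and set_type() files an instance under that name. The `db.types` registry on a Database instance is an assumption:

# hypothetical custom type
class Boolean(Type):
	sqlite = 'BOOLEAN'
	postgresql = 'BOOLEAN'
	mysql = 'BOOLEAN'

	def serialize(self, dbtype, value):
		# reading from the database: normalise to a real bool
		return bool(value)

	def deserialize(self, dbtype, value):
		# writing to the database: sqlite stores booleans as integers
		return int(bool(value)) if dbtype == 'sqlite' else bool(value)

db.types.set_type(Boolean)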

40
setup.cfg Normal file
View file

@ -0,0 +1,40 @@
[metadata]
name = Barkshark SQL
version = 0.1.0
author = Zoey Mae
author_email = zoey@barkshark.xyz
url = https://git.barkshark.xyz/izaliamae/barkshark-sql
description = Simple SQL client
license = CNPL 6+
license_file = LICENSE
platform = any
keywords = python database sql sqlite postgresql mysql
classifiers =
Development Status :: 3 - Alpha
Intended Audience :: Developers
Operating System :: OS Independent
Programming Language :: Python
	Programming Language :: Python :: 3.6
	Programming Language :: Python :: 3.7
	Programming Language :: Python :: 3.8
	Programming Language :: Python :: 3.9
Topic :: Software Development :: Libraries :: Python Modules
project_urls =
Bug Tracker = https://git.barkshark.xyz/izaliamae/barkshark-sql/issues
Documentation = https://git.barkshark.xyz/izaliamae/barkshark-sql/wiki
Source Code = https://git.barkshark.xyz/izaliamae/barkshark-sql
[options]
python_requires = >= 3.6
packages =
barkshark_sql
setup_requires =
setuptools >= 38.3.0
install_requires =
izzylib >= 0.7.0
[bdist_wheel]
universal = false
[sdist]
formats = zip, gztar

2
setup.py Normal file
View file

@ -0,0 +1,2 @@
import setuptools
setuptools.setup()