Repo created

Fr4nz D13trich 2025-11-20 14:05:57 +01:00
parent 324070df30
commit 2d33a757bf
644 changed files with 99721 additions and 2 deletions

7
AUTHORS Normal file

@@ -0,0 +1,7 @@
You can view the list of people who have contributed to the code base in the version control history:
https://github.com/bitfireAT/davx5-ose/graphs/contributors
Translators are not mentioned in the history explicitly.
The list of translators can be found in the About screen.
Every contribution is welcome. There are many other forms of contributing besides writing code!

117
CONTRIBUTING.md Normal file

@@ -0,0 +1,117 @@
**Thank you for your interest in contributing to DAVx⁵!**
# Licensing
All work in this repository is [licensed under the GPLv3](LICENSE).
We (bitfire.at, initial and main contributors) also ask you to give us
permission to use your contribution for related non-open source projects
like [Managed DAVx⁵](https://www.davx5.com/organizations/managed-davx5).
If you send us a pull request, our CLA bot will ask you to sign the
Contributor's License Agreement so that we can use your contribution.
# Copyright
Make sure that every file that contains significant work (at least every code file)
starts with the copyright header:
```
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
```
You can set this in Android Studio:
1. Settings / Editor / Copyright / Copyright Profiles
2. Paste the text above (without the stars).
3. Set Formatting so that the preview looks exactly like the block above, with one blank line after it (see the sketch after this list).
4. Set this copyright profile as the default profile for the project.
5. Apply copyright: right-click in file tree / Update copyright.
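For illustration, a minimal sketch of how the top of a Kotlin file then looks (the package and class names are placeholders, not taken from the code base): the copyright block, one blank line, then the actual code.
```
/*
 * Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
 */

package at.bitfire.davdroid.sample   // placeholder package name, just for illustration

class SampleClass
```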
# Style guide
Please adhere to the [Kotlin style guide](https://developer.android.com/kotlin/style-guide) and
the following hints to make the source code uniform.
**Have a look at similar files and copy their style if you're not certain.**
Sample file (pay attention to blank lines and other formatting):
```
<Copyright header, see above>

class MyClass(arg1: Int) : SuperClass() {

    companion object {

        const val CONSTANT_STRING = "Constant String"

        fun staticMethod() {    // Use static methods when you don't need the object context.
            // …
        }

    }


    var someProperty: String = "12345"
    var someRelatedProperty: Int = 12345


    init {
        // constructor
    }


    /**
     * Use KDoc to document important methods. Don't use it dogmatically, but writing proper documentation
     * (not just the method name with spaces) helps you to re-think what the method shall really do.
     */
    fun aFun1() {       // Group methods by some logic (for instance, the order in which they will be called)
    }                   // and alphabetically within a group.

    fun anotherFun() {
        // …
    }


    fun somethingCompletelyDifferent() {    // two blank lines to separate groups
    }

    fun helperForSomethingCompletelyDifferent() {
        someCall(arg1, arg2, arg3, arg4)    // function calls: stick to one line unless it becomes confusing
    }


    class Model(                            // two blank lines before inner classes
        someArgument: SomeLongClass,        // arguments in multiple lines when they're too long for one line
        anotherArgument: AnotherLongType,
        thirdArgument: AnotherLongTypeName
    ) : ViewModel() {

        fun abc() {
        }

    }

}
```
In general, use one blank line to separate items within a group, and two blank lines to separate
groups. In rare cases, when methods are tightly coupled and only serve as helpers for another
method, they may follow the calling method without separating blank lines.
## Tests
Test classes should be in the appropriate directory (see existing tests) and in the same package as the
tested class. Tests are usually named like `methodToBeTested_Condition()`; see
[Test apps on Android](https://developer.android.com/training/testing/) and the sketch below.
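For illustration, a unit test following this naming convention could look like the sketch below. The tested function here is just `String.removeSuffix` from the Kotlin standard library; the class and method names are made up for the example.
```
import org.junit.Assert.assertEquals
import org.junit.Test

class StringExtensionsTest {

    // method to be tested: removeSuffix(); condition: the string ends with a slash
    @Test
    fun removeSuffix_EndsWithSlash() {
        assertEquals("https://example.com/dav", "https://example.com/dav/".removeSuffix("/"))
    }

}
```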
# Authors
If you make significant contributions, feel free to add yourself to the [AUTHORS file](AUTHORS).

674
LICENSE Normal file

@@ -0,0 +1,674 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<http://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<http://www.gnu.org/philosophy/why-not-lgpl.html>.


@@ -1,3 +1,45 @@
# davx5
Contact and Calendar Synchronisation Android
[![Website](https://img.shields.io/website?style=flat-square&up_color=%237cb342&url=https%3A%2F%2Fwww.davx5.com)](https://www.davx5.com/)
[![F-Droid](https://img.shields.io/f-droid/v/at.bitfire.davdroid?style=flat-square)](https://f-droid.org/packages/at.bitfire.davdroid/)
[![License](https://img.shields.io/github/license/bitfireAT/davx5-ose?style=flat-square)](https://github.com/bitfireAT/davx5-ose/blob/main/LICENSE)
[![Follow @davx5app@fosstodon.org](https://img.shields.io/mastodon/follow/109598783742737223?domain=https%3A%2F%2Ffosstodon.org&style=flat-square)](https://fosstodon.org/@davx5app)
[![Development tests](https://github.com/bitfireAT/davx5-ose/actions/workflows/test-dev.yml/badge.svg)](https://github.com/bitfireAT/davx5-ose/actions/workflows/test-dev.yml)
![DAVx⁵ logo](app/src/main/res/mipmap-xxxhdpi/ic_launcher.png)
DAVx⁵
========
Please see the [DAVx⁵ Web site](https://www.davx5.com) for
comprehensive information about DAVx⁵, including a list of services it has been tested with.
DAVx⁵ is licensed under the [GPLv3 License](LICENSE).
News and updates:
* [@davx5app@fosstodon.org](https://fosstodon.org/@davx5app) on Mastodon
**Help, feature requests, bug reports: [DAVx⁵ discussions](https://github.com/bitfireAT/davx5-ose/discussions)**
Parts of DAVx⁵ have been split out into these libraries:
* [cert4android](https://github.com/bitfireAT/cert4android) – custom certificate management
* [dav4jvm](https://github.com/bitfireAT/dav4jvm) – WebDAV/CalDAV/CardDAV framework
* [synctools](https://github.com/bitfireAT/synctools) – iCalendar/vCard/Tasks processing and content provider access
**If you want to support DAVx⁵, please consider [donating to DAVx⁵](https://www.davx5.com/donate)
or [purchasing it](https://www.davx5.com/download).**
USED THIRD-PARTY LIBRARIES
==========================
The most important libraries which are used by DAVx⁵ (alphabetically):
* [dnsjava](https://github.com/dnsjava/dnsjava) [BSD License](https://github.com/dnsjava/dnsjava/blob/master/LICENSE)
* [ez-vcard](https://github.com/mangstadt/ez-vcard) [New BSD License](https://github.com/mangstadt/ez-vcard/blob/master/LICENSE)
* [iCal4j](https://github.com/ical4j/ical4j) [New BSD License](https://github.com/ical4j/ical4j/blob/develop/LICENSE.txt)
* [okhttp](https://square.github.io/okhttp) [Apache License, Version 2.0](https://square.github.io/okhttp/#license)
See _About / Libraries_ in the app for all used libraries and their licenses.

5
SECURITY.md Normal file

@@ -0,0 +1,5 @@
# Security Policy
## Reporting a Vulnerability
Please report security vulnerabilities using our [secure support form](https://www.davx5.com/support) or via email to support-en@davx5.com.

2
app/.gitignore vendored Normal file

@@ -0,0 +1,2 @@
build
target

227
app/build.gradle.kts Normal file

@@ -0,0 +1,227 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
plugins {
alias(libs.plugins.android.application)
alias(libs.plugins.compose.compiler)
alias(libs.plugins.hilt)
alias(libs.plugins.kotlin.android)
alias(libs.plugins.ksp)
alias(libs.plugins.mikepenz.aboutLibraries.android)
}
// Android configuration
android {
compileSdk = 36
defaultConfig {
applicationId = "at.bitfire.davdroid"
versionCode = 405050004
versionName = "4.5.5"
base.archivesName = "davx5-ose-$versionName"
minSdk = 24 // Android 7.0
targetSdk = 36 // Android 16
buildConfigField("boolean", "customCertsUI", "true")
testInstrumentationRunner = "at.bitfire.davdroid.HiltTestRunner"
}
java {
toolchain {
languageVersion = JavaLanguageVersion.of(21)
}
}
compileOptions {
// required for
// - dnsjava 3.x: java.nio.file.Path
// - ical4android: time API
isCoreLibraryDesugaringEnabled = true
}
buildFeatures {
buildConfig = true
compose = true
}
// Java namespace for our classes (not to be confused with Android package ID)
namespace = "at.bitfire.davdroid"
flavorDimensions += "distribution"
productFlavors {
create("ose") {
dimension = "distribution"
versionNameSuffix = "-ose"
}
}
sourceSets {
getByName("androidTest") {
assets.srcDir("$projectDir/schemas")
}
}
signingConfigs {
create("bitfire") {
storeFile = file(System.getenv("ANDROID_KEYSTORE") ?: "/dev/null")
storePassword = System.getenv("ANDROID_KEYSTORE_PASSWORD")
keyAlias = System.getenv("ANDROID_KEY_ALIAS")
keyPassword = System.getenv("ANDROID_KEY_PASSWORD")
}
}
buildTypes {
getByName("release") {
isMinifyEnabled = true
proguardFiles(getDefaultProguardFile("proguard-android-optimize.txt"), "proguard-rules-release.pro")
isShrinkResources = true
signingConfig = signingConfigs.findByName("bitfire")
}
}
lint {
disable += arrayOf("GoogleAppIndexingWarning", "ImpliedQuantity", "MissingQuantity", "MissingTranslation", "ExtraTranslation", "RtlEnabled", "RtlHardcoded", "Typos")
}
androidResources {
generateLocaleConfig = true
}
packaging {
resources {
// multiple (test) dependencies have LICENSE files at the same location
merges += arrayOf("META-INF/LICENSE*")
}
}
@Suppress("UnstableApiUsage")
testOptions {
managedDevices {
localDevices {
create("virtual") {
device = "Pixel 3"
// TBD: API level 35 and higher causes network tests to fail sometimes, see https://github.com/bitfireAT/davx5-ose/issues/1525
// Suspected reason: https://developer.android.com/about/versions/15/behavior-changes-all#background-network-access
apiLevel = 34
systemImageSource = "aosp-atd"
}
}
}
}
}
ksp {
arg("room.schemaLocation", "$projectDir/schemas")
}
aboutLibraries {
export {
// exclude timestamps for reproducible builds [https://github.com/bitfireAT/davx5-ose/issues/994]
excludeFields.add("generated")
}
}
dependencies {
// core
implementation(libs.kotlin.stdlib)
implementation(libs.kotlinx.coroutines)
coreLibraryDesugaring(libs.android.desugaring)
// Hilt
implementation(libs.hilt.android.base)
ksp(libs.androidx.hilt.compiler)
ksp(libs.hilt.android.compiler)
// support libs
implementation(libs.androidx.activityCompose)
implementation(libs.androidx.appcompat)
implementation(libs.androidx.browser)
implementation(libs.androidx.core)
implementation(libs.androidx.hilt.navigation.compose)
implementation(libs.androidx.hilt.work)
implementation(libs.androidx.lifecycle.runtime.compose)
implementation(libs.androidx.lifecycle.viewmodel.base)
implementation(libs.androidx.lifecycle.viewmodel.compose)
implementation(libs.androidx.paging)
implementation(libs.androidx.paging.compose)
implementation(libs.androidx.preference)
implementation(libs.androidx.security)
implementation(libs.androidx.work.base)
// Jetpack Compose
implementation(libs.compose.accompanist.permissions)
implementation(platform(libs.compose.bom))
implementation(libs.compose.material3)
implementation(libs.compose.materialIconsExtended)
debugImplementation(libs.compose.ui.tooling)
implementation(libs.compose.ui.toolingPreview)
// Glance Widgets
implementation(libs.glance.base)
implementation(libs.glance.material)
// Jetpack Room
implementation(libs.room.runtime)
implementation(libs.room.base)
implementation(libs.room.paging)
ksp(libs.room.compiler)
// own libraries
implementation(libs.bitfire.cert4android)
implementation(libs.bitfire.dav4jvm) {
exclude(group="junit")
exclude(group="org.ogce", module="xpp3") // Android has its own XmlPullParser implementation
}
implementation(libs.bitfire.synctools) {
exclude(group="androidx.test") // synctools declares test rules, but we don't want them in non-test code
exclude(group = "junit")
}
// third-party libs
@Suppress("RedundantSuppression")
implementation(libs.dnsjava)
implementation(libs.guava)
implementation(libs.mikepenz.aboutLibraries.m3)
implementation(libs.okhttp.base)
implementation(libs.okhttp.brotli)
implementation(libs.okhttp.logging)
implementation(libs.openid.appauth)
implementation(libs.unifiedpush) {
// UnifiedPush connector seems to be using a workaround by importing this library.
// Will be removed after https://github.com/tink-crypto/tink-java-apps/pull/5 is merged.
// See: https://codeberg.org/UnifiedPush/android-connector/src/commit/28cb0d622ed0a972996041ab9cc85b701abc48c6/connector/build.gradle#L56-L59
exclude(group = "com.google.crypto.tink", module = "tink")
}
implementation(libs.unifiedpush.fcm)
// force some versions for compatibility with our minSdk level (see version catalog for details)
implementation(libs.commons.codec)
implementation(libs.commons.lang)
// for tests
androidTestImplementation(libs.androidx.arch.core.testing)
androidTestImplementation(libs.androidx.test.core)
androidTestImplementation(libs.androidx.test.junit)
androidTestImplementation(libs.androidx.test.rules)
androidTestImplementation(libs.androidx.test.runner)
androidTestImplementation(libs.androidx.work.testing)
androidTestImplementation(libs.hilt.android.testing)
androidTestImplementation(libs.junit)
androidTestImplementation(libs.kotlinx.coroutines.test)
androidTestImplementation(libs.mockk.android)
androidTestImplementation(libs.okhttp.mockwebserver)
androidTestImplementation(libs.room.testing)
testImplementation(libs.bitfire.dav4jvm)
testImplementation(libs.junit)
testImplementation(libs.mockk)
testImplementation(libs.okhttp.mockwebserver)
}

13
app/lint.xml Normal file

@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Copyright (c) 2013 – 2015 Ricki Hirner (bitfire web engineering).
~ All rights reserved. This program and the accompanying materials
~ are made available under the terms of the GNU Public License v3.0
~ which accompanies this distribution, and is available at
~ http://www.gnu.org/licenses/gpl.html
-->
<lint>
<issue id="InvalidPackage" severity="ignore" />
<issue id="MissingTranslation" severity="warning" />
</lint>


@@ -0,0 +1,31 @@
# R8 usage for DAVx⁵:
# shrinking yes (only in release builds)
# optimization yes (on by R8 defaults)
# full-mode no (see gradle.properties)
# obfuscation no (open-source)
-dontobfuscate
-printusage build/reports/r8-usage.txt
# keep rules
-keep class at.bitfire.** { *; } # all DAVx5 code is required
-keep class org.xmlpull.** { *; }
# Additional rules which are now required since missing classes can't be ignored in R8 anymore.
# [https://developer.android.com/build/releases/past-releases/agp-7-0-0-release-notes#r8-missing-class-warning]
-dontwarn org.xmlpull.**
# dnsjava
-dontwarn com.sun.jna.**
-dontwarn lombok.**
-dontwarn javax.naming.NamingException
-dontwarn javax.naming.directory.**
-dontwarn sun.net.spi.nameservice.NameService
-dontwarn sun.net.spi.nameservice.NameServiceDescriptor
-dontwarn org.xbill.DNS.spi.DnsjavaInetAddressResolverProvider
# okhttp
# https://github.com/bitfireAT/davx5/issues/711 / https://github.com/square/okhttp/issues/8574
-keep class okhttp3.internal.idn.IdnaMappingTable { *; }
-keep class okhttp3.internal.idn.IdnaMappingTableInstanceKt{ *; }


@@ -0,0 +1,398 @@
{
"formatVersion": 1,
"database": {
"version": 10,
"identityHash": "6fcabe50cbd00a4215dbe536a565dd2a",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `owner` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "owner",
"columnName": "owner",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '6fcabe50cbd00a4215dbe536a565dd2a')"
]
}
}


@@ -0,0 +1,536 @@
{
"formatVersion": 1,
"database": {
"version": 11,
"identityHash": "223aa7f0fd53730921ca212a663585d8",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `owner` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "owner",
"columnName": "owner",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '223aa7f0fd53730921ca212a663585d8')"
]
}
}


@@ -0,0 +1,615 @@
{
"formatVersion": 1,
"database": {
"version": 12,
"identityHash": "67fafceecee2d97cac6a62d46fa2c3e2",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '67fafceecee2d97cac6a62d46fa2c3e2')"
]
}
}


@@ -0,0 +1,640 @@
{
"formatVersion": 1,
"database": {
"version": 13,
"identityHash": "0a6a9705ff471acd766ab96e3edf8ac3",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushSubscription` TEXT, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '0a6a9705ff471acd766ab96e3edf8ac3')"
]
}
}


@@ -0,0 +1,669 @@
{
"formatVersion": 1,
"database": {
"version": 14,
"identityHash": "9a0eb47f27473eab254db568081a4585",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushSubscription` TEXT, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_ownerId_type",
"unique": false,
"columnNames": [
"ownerId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_ownerId_type` ON `${TABLE_NAME}` (`ownerId`, `type`)"
},
{
"name": "index_collection_pushTopic_type",
"unique": false,
"columnNames": [
"pushTopic",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_pushTopic_type` ON `${TABLE_NAME}` (`pushTopic`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
},
{
"name": "index_webdav_document_parentId",
"unique": false,
"columnNames": [
"parentId"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_webdav_document_parentId` ON `${TABLE_NAME}` (`parentId`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '9a0eb47f27473eab254db568081a4585')"
]
}
}


@@ -0,0 +1,675 @@
{
"formatVersion": 1,
"database": {
"version": 15,
"identityHash": "ab1cb6057d8e050f6648bea46ae0943d",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushSubscription` TEXT, `pushSubscriptionExpires` INTEGER, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "pushSubscriptionExpires",
"columnName": "pushSubscriptionExpires",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_ownerId_type",
"unique": false,
"columnNames": [
"ownerId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_ownerId_type` ON `${TABLE_NAME}` (`ownerId`, `type`)"
},
{
"name": "index_collection_pushTopic_type",
"unique": false,
"columnNames": [
"pushTopic",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_pushTopic_type` ON `${TABLE_NAME}` (`pushTopic`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
},
{
"name": "index_webdav_document_parentId",
"unique": false,
"columnNames": [
"parentId"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_webdav_document_parentId` ON `${TABLE_NAME}` (`parentId`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, 'ab1cb6057d8e050f6648bea46ae0943d')"
]
}
}
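The JSON above is one of Room's exported schema descriptions: for each released database version it records every entity's `createSql`, fields, indices, and foreign keys. As a point of reference, here is a minimal sketch of how a table like `principal` (foreign key to `service`, unique index on `serviceId` + `url`) is typically declared as a Room entity — the class and property shapes are assumptions for illustration, not the project's actual source:

```kotlin
import androidx.room.Entity
import androidx.room.ForeignKey
import androidx.room.Index
import androidx.room.PrimaryKey

// Illustrative sketch only; the real entity class may differ (package, converters, extra annotations).
@Entity(
    tableName = "principal",
    foreignKeys = [
        ForeignKey(
            entity = Service::class,          // assumed name of the entity behind the `service` table
            parentColumns = ["id"],
            childColumns = ["serviceId"],
            onDelete = ForeignKey.CASCADE     // matches "onDelete": "CASCADE" in the schema
        )
    ],
    indices = [
        Index(value = ["serviceId", "url"], unique = true)
    ]
)
data class Principal(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,
    val serviceId: Long,
    val url: String,               // TEXT NOT NULL; the real entity may use a URL type with a converter
    val displayName: String? = null
)
```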

View file

@ -0,0 +1,675 @@
{
"formatVersion": 1,
"database": {
"version": 16,
"identityHash": "2ff7560d957e03a78b4b7de88aa9593b",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezoneId` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushSubscription` TEXT, `pushSubscriptionExpires` INTEGER, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezoneId",
"columnName": "timezoneId",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "pushSubscriptionExpires",
"columnName": "pushSubscriptionExpires",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_ownerId_type",
"unique": false,
"columnNames": [
"ownerId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_ownerId_type` ON `${TABLE_NAME}` (`ownerId`, `type`)"
},
{
"name": "index_collection_pushTopic_type",
"unique": false,
"columnNames": [
"pushTopic",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_pushTopic_type` ON `${TABLE_NAME}` (`pushTopic`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER",
"notNull": false
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
},
{
"name": "index_webdav_document_parentId",
"unique": false,
"columnNames": [
"parentId"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_webdav_document_parentId` ON `${TABLE_NAME}` (`parentId`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '2ff7560d957e03a78b4b7de88aa9593b')"
]
}
}
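Schema files like this one are generated at build time when schema export is enabled on the database class; the `identityHash` recorded here is the same value the `setupQueries` write into `room_master_table`, and Room compares it against the compiled entities whenever the database is opened. A hedged sketch of the enabling declaration (class name and entity list are placeholders; the output directory is configured in Gradle via `room.schemaLocation` or the Room Gradle plugin's `schemaDirectory`, depending on the setup):

```kotlin
import androidx.room.Database
import androidx.room.RoomDatabase

// Sketch: exportSchema = true makes the annotation processor emit a <version>.json file
// (like the ones in this commit) into the configured schema directory.
@Database(
    entities = [
        Service::class, HomeSet::class, Collection::class, Principal::class,
        SyncStats::class, WebDavDocument::class, WebDavMount::class   // assumed entity class names
    ],
    version = 16,
    exportSchema = true
)
abstract class AppDatabase : RoomDatabase()   // "AppDatabase" is a placeholder name
```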

View file

@ -0,0 +1,648 @@
{
"formatVersion": 1,
"database": {
"version": 17,
"identityHash": "cd15d368408570cc2e57252816869de2",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
]
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezoneId` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushVapidKey` TEXT, `pushSubscription` TEXT, `pushSubscriptionExpires` INTEGER, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER"
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER"
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT"
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER"
},
{
"fieldPath": "timezoneId",
"columnName": "timezoneId",
"affinity": "TEXT"
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER"
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER"
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER"
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT"
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT"
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushVapidKey",
"columnName": "pushVapidKey",
"affinity": "TEXT"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT"
},
{
"fieldPath": "pushSubscriptionExpires",
"columnName": "pushSubscriptionExpires",
"affinity": "INTEGER"
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_ownerId_type",
"unique": false,
"columnNames": [
"ownerId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_ownerId_type` ON `${TABLE_NAME}` (`ownerId`, `type`)"
},
{
"name": "index_collection_pushTopic_type",
"unique": false,
"columnNames": [
"pushTopic",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_pushTopic_type` ON `${TABLE_NAME}` (`pushTopic`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER"
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT"
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT"
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER"
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER"
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER"
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER"
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER"
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER"
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
},
{
"name": "index_webdav_document_parentId",
"unique": false,
"columnNames": [
"parentId"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_webdav_document_parentId` ON `${TABLE_NAME}` (`parentId`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
}
}
],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, 'cd15d368408570cc2e57252816869de2')"
]
}
}
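Relative to version 16, this version-17 schema only adds a nullable `pushVapidKey` TEXT column to `collection`; the `createSql` of all other tables is unchanged. Whether that step is covered by a hand-written migration or a Room auto-migration is not visible in this commit; purely for illustration, an equivalent manual migration would look roughly like this:

```kotlin
import androidx.room.migration.Migration
import androidx.sqlite.db.SupportSQLiteDatabase

// Hypothetical manual migration 16 -> 17 (the app may well use an auto-migration instead).
val MIGRATION_16_17 = object : Migration(16, 17) {
    override fun migrate(db: SupportSQLiteDatabase) {
        // The new column is nullable, so a plain ALTER TABLE is enough.
        db.execSQL("ALTER TABLE collection ADD COLUMN pushVapidKey TEXT")
    }
}
```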

View file

@ -0,0 +1,648 @@
{
"formatVersion": 1,
"database": {
"version": 18,
"identityHash": "6a0f7e1553e1f621ae7913ea14370fd0",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
]
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `ownerId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `color` INTEGER, `timezoneId` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, `pushTopic` TEXT, `supportsWebPush` INTEGER NOT NULL DEFAULT 0, `pushVapidKey` TEXT, `pushSubscription` TEXT, `pushSubscriptionExpires` INTEGER, `pushSubscriptionCreated` INTEGER, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL , FOREIGN KEY(`ownerId`) REFERENCES `principal`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER"
},
{
"fieldPath": "ownerId",
"columnName": "ownerId",
"affinity": "INTEGER"
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT"
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER"
},
{
"fieldPath": "timezoneId",
"columnName": "timezoneId",
"affinity": "TEXT"
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER"
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER"
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER"
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT"
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "pushTopic",
"columnName": "pushTopic",
"affinity": "TEXT"
},
{
"fieldPath": "supportsWebPush",
"columnName": "supportsWebPush",
"affinity": "INTEGER",
"notNull": true,
"defaultValue": "0"
},
{
"fieldPath": "pushVapidKey",
"columnName": "pushVapidKey",
"affinity": "TEXT"
},
{
"fieldPath": "pushSubscription",
"columnName": "pushSubscription",
"affinity": "TEXT"
},
{
"fieldPath": "pushSubscriptionExpires",
"columnName": "pushSubscriptionExpires",
"affinity": "INTEGER"
},
{
"fieldPath": "pushSubscriptionCreated",
"columnName": "pushSubscriptionCreated",
"affinity": "INTEGER"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_ownerId_type",
"unique": false,
"columnNames": [
"ownerId",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_ownerId_type` ON `${TABLE_NAME}` (`ownerId`, `type`)"
},
{
"name": "index_collection_pushTopic_type",
"unique": false,
"columnNames": [
"pushTopic",
"type"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_pushTopic_type` ON `${TABLE_NAME}` (`pushTopic`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
},
{
"table": "principal",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"ownerId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "principal",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `url` TEXT NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_principal_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_principal_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `dataType` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "dataType",
"columnName": "dataType",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_syncstats_collectionId_dataType",
"unique": true,
"columnNames": [
"collectionId",
"dataType"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_dataType` ON `${TABLE_NAME}` (`collectionId`, `dataType`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_document",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `mountId` INTEGER NOT NULL, `parentId` INTEGER, `name` TEXT NOT NULL, `isDirectory` INTEGER NOT NULL, `displayName` TEXT, `mimeType` TEXT, `eTag` TEXT, `lastModified` INTEGER, `size` INTEGER, `mayBind` INTEGER, `mayUnbind` INTEGER, `mayWriteContent` INTEGER, `quotaAvailable` INTEGER, `quotaUsed` INTEGER, FOREIGN KEY(`mountId`) REFERENCES `webdav_mount`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`parentId`) REFERENCES `webdav_document`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "mountId",
"columnName": "mountId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "parentId",
"columnName": "parentId",
"affinity": "INTEGER"
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "isDirectory",
"columnName": "isDirectory",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT"
},
{
"fieldPath": "mimeType",
"columnName": "mimeType",
"affinity": "TEXT"
},
{
"fieldPath": "eTag",
"columnName": "eTag",
"affinity": "TEXT"
},
{
"fieldPath": "lastModified",
"columnName": "lastModified",
"affinity": "INTEGER"
},
{
"fieldPath": "size",
"columnName": "size",
"affinity": "INTEGER"
},
{
"fieldPath": "mayBind",
"columnName": "mayBind",
"affinity": "INTEGER"
},
{
"fieldPath": "mayUnbind",
"columnName": "mayUnbind",
"affinity": "INTEGER"
},
{
"fieldPath": "mayWriteContent",
"columnName": "mayWriteContent",
"affinity": "INTEGER"
},
{
"fieldPath": "quotaAvailable",
"columnName": "quotaAvailable",
"affinity": "INTEGER"
},
{
"fieldPath": "quotaUsed",
"columnName": "quotaUsed",
"affinity": "INTEGER"
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
},
"indices": [
{
"name": "index_webdav_document_mountId_parentId_name",
"unique": true,
"columnNames": [
"mountId",
"parentId",
"name"
],
"orders": [],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_webdav_document_mountId_parentId_name` ON `${TABLE_NAME}` (`mountId`, `parentId`, `name`)"
},
{
"name": "index_webdav_document_parentId",
"unique": false,
"columnNames": [
"parentId"
],
"orders": [],
"createSql": "CREATE INDEX IF NOT EXISTS `index_webdav_document_parentId` ON `${TABLE_NAME}` (`parentId`)"
}
],
"foreignKeys": [
{
"table": "webdav_mount",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"mountId"
],
"referencedColumns": [
"id"
]
},
{
"table": "webdav_document",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"parentId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "webdav_mount",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `name` TEXT NOT NULL, `url` TEXT NOT NULL)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "name",
"columnName": "name",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
}
],
"primaryKey": {
"autoGenerate": true,
"columnNames": [
"id"
]
}
}
],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '6a0f7e1553e1f621ae7913ea14370fd0')"
]
}
}
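The only difference between version 17 and this version-18 schema is that `syncstats.authority` has been renamed to `dataType`, with the unique index renamed to match. Room can express such a rename declaratively through an auto-migration spec; the snippet below is a sketch of that mechanism under an assumed class name, not necessarily the migration the project ships:

```kotlin
import androidx.room.RenameColumn
import androidx.room.migration.AutoMigrationSpec

// Hypothetical spec describing the 17 -> 18 column rename. It would be registered on the
// @Database annotation as autoMigrations = [AutoMigration(from = 17, to = 18, spec = RenameAuthorityToDataType::class)].
@RenameColumn(tableName = "syncstats", fromColumnName = "authority", toColumnName = "dataType")
class RenameAuthorityToDataType : AutoMigrationSpec
```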

View file

@ -0,0 +1,298 @@
{
"formatVersion": 1,
"database": {
"version": 8,
"identityHash": "b8699ef3cc4c62e8851df4360fb69e00",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `owner` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "owner",
"columnName": "owner",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
}
]
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, 'b8699ef3cc4c62e8851df4360fb69e00')"
]
}
}
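In these exports, `affinity` and `notNull` mirror the Kotlin property types of the entity: non-null `Long`/`Boolean` properties become `INTEGER NOT NULL`, nullable ones lose the constraint, and `String?` maps to nullable `TEXT`. A small illustrative mapping using property names from the `collection` table above (class shape assumed, not the real entity):

```kotlin
import androidx.room.Entity
import androidx.room.PrimaryKey

// Sketch only: shows how Kotlin nullability turns into the notNull flags seen in the schema.
@Entity(tableName = "collection")
data class CollectionSketch(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,  // INTEGER NOT NULL (auto-generated key)
    val url: String,                                     // TEXT NOT NULL
    val sync: Boolean,                                   // INTEGER NOT NULL (stored as 0/1)
    val supportsVEVENT: Boolean?,                        // INTEGER, "notNull": false
    val color: Int?,                                     // INTEGER, "notNull": false
    val displayName: String?                             // TEXT, "notNull": false
)
```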

View file

@ -0,0 +1,366 @@
{
"formatVersion": 1,
"database": {
"version": 9,
"identityHash": "7e4bfdf7f9fa3529c333cf9485f8cf50",
"entities": [
{
"tableName": "service",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `accountName` TEXT NOT NULL, `type` TEXT NOT NULL, `principal` TEXT)",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "accountName",
"columnName": "accountName",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "principal",
"columnName": "principal",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_service_accountName_type",
"unique": true,
"columnNames": [
"accountName",
"type"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_service_accountName_type` ON `${TABLE_NAME}` (`accountName`, `type`)"
}
],
"foreignKeys": []
},
{
"tableName": "homeset",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `personal` INTEGER NOT NULL, `url` TEXT NOT NULL, `privBind` INTEGER NOT NULL, `displayName` TEXT, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "personal",
"columnName": "personal",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privBind",
"columnName": "privBind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_homeset_serviceId_url",
"unique": true,
"columnNames": [
"serviceId",
"url"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_homeset_serviceId_url` ON `${TABLE_NAME}` (`serviceId`, `url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "collection",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `serviceId` INTEGER NOT NULL, `homeSetId` INTEGER, `type` TEXT NOT NULL, `url` TEXT NOT NULL, `privWriteContent` INTEGER NOT NULL, `privUnbind` INTEGER NOT NULL, `forceReadOnly` INTEGER NOT NULL, `displayName` TEXT, `description` TEXT, `owner` TEXT, `color` INTEGER, `timezone` TEXT, `supportsVEVENT` INTEGER, `supportsVTODO` INTEGER, `supportsVJOURNAL` INTEGER, `source` TEXT, `sync` INTEGER NOT NULL, FOREIGN KEY(`serviceId`) REFERENCES `service`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE , FOREIGN KEY(`homeSetId`) REFERENCES `homeset`(`id`) ON UPDATE NO ACTION ON DELETE SET NULL )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "serviceId",
"columnName": "serviceId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "homeSetId",
"columnName": "homeSetId",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "type",
"columnName": "type",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "url",
"columnName": "url",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "privWriteContent",
"columnName": "privWriteContent",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "privUnbind",
"columnName": "privUnbind",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "forceReadOnly",
"columnName": "forceReadOnly",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "displayName",
"columnName": "displayName",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "description",
"columnName": "description",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "owner",
"columnName": "owner",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "color",
"columnName": "color",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "timezone",
"columnName": "timezone",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "supportsVEVENT",
"columnName": "supportsVEVENT",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVTODO",
"columnName": "supportsVTODO",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "supportsVJOURNAL",
"columnName": "supportsVJOURNAL",
"affinity": "INTEGER",
"notNull": false
},
{
"fieldPath": "source",
"columnName": "source",
"affinity": "TEXT",
"notNull": false
},
{
"fieldPath": "sync",
"columnName": "sync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_collection_serviceId_type",
"unique": false,
"columnNames": [
"serviceId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_serviceId_type` ON `${TABLE_NAME}` (`serviceId`, `type`)"
},
{
"name": "index_collection_homeSetId_type",
"unique": false,
"columnNames": [
"homeSetId",
"type"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_homeSetId_type` ON `${TABLE_NAME}` (`homeSetId`, `type`)"
},
{
"name": "index_collection_url",
"unique": false,
"columnNames": [
"url"
],
"createSql": "CREATE INDEX IF NOT EXISTS `index_collection_url` ON `${TABLE_NAME}` (`url`)"
}
],
"foreignKeys": [
{
"table": "service",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"serviceId"
],
"referencedColumns": [
"id"
]
},
{
"table": "homeset",
"onDelete": "SET NULL",
"onUpdate": "NO ACTION",
"columns": [
"homeSetId"
],
"referencedColumns": [
"id"
]
}
]
},
{
"tableName": "syncstats",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `collectionId` INTEGER NOT NULL, `authority` TEXT NOT NULL, `lastSync` INTEGER NOT NULL, FOREIGN KEY(`collectionId`) REFERENCES `collection`(`id`) ON UPDATE NO ACTION ON DELETE CASCADE )",
"fields": [
{
"fieldPath": "id",
"columnName": "id",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "collectionId",
"columnName": "collectionId",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "authority",
"columnName": "authority",
"affinity": "TEXT",
"notNull": true
},
{
"fieldPath": "lastSync",
"columnName": "lastSync",
"affinity": "INTEGER",
"notNull": true
}
],
"primaryKey": {
"columnNames": [
"id"
],
"autoGenerate": true
},
"indices": [
{
"name": "index_syncstats_collectionId_authority",
"unique": true,
"columnNames": [
"collectionId",
"authority"
],
"createSql": "CREATE UNIQUE INDEX IF NOT EXISTS `index_syncstats_collectionId_authority` ON `${TABLE_NAME}` (`collectionId`, `authority`)"
}
],
"foreignKeys": [
{
"table": "collection",
"onDelete": "CASCADE",
"onUpdate": "NO ACTION",
"columns": [
"collectionId"
],
"referencedColumns": [
"id"
]
}
]
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, '7e4bfdf7f9fa3529c333cf9485f8cf50')"
]
}
}

1
app/src/.gitignore vendored Normal file
View file

@@ -0,0 +1 @@
espressoTest

View file

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
android:installLocation="internalOnly">
<!-- account management permissions are only needed up to API level 22; they are not required for the app's own accounts on newer API levels -->
<uses-permission android:name="android.permission.AUTHENTICATE_ACCOUNTS" android:maxSdkVersion="22"/>
<uses-permission android:name="android.permission.GET_ACCOUNTS" android:maxSdkVersion="22"/>
<uses-permission android:name="android.permission.MANAGE_ACCOUNTS" android:maxSdkVersion="22"/>
</manifest>

View file

@@ -0,0 +1,30 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid
import android.util.Xml
import at.bitfire.dav4jvm.XmlUtils
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Test
class ExternalLibrariesTest {
@Test
fun test_Dav4jvm_XmlUtils_NewPullParser_RelaxedParsing() {
val parser = XmlUtils.newPullParser()
assertTrue(parser.getFeature(Xml.FEATURE_RELAXED))
}
@Test
fun testOkhttpHttpUrl_PublicSuffixList() {
// HttpUrl.topPrivateDomain() requires okhttp's internal PublicSuffixList.
// In Android, loading the PublicSuffixList is done over AndroidX startup.
// This test verifies that everything is working.
assertEquals("example.com", "http://example.com".toHttpUrl().topPrivateDomain())
}
}

View file

@@ -0,0 +1,42 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid
import android.app.Application
import android.content.Context
import android.os.Build
import android.os.Bundle
import androidx.test.runner.AndroidJUnitRunner
import at.bitfire.davdroid.di.TestCoroutineDispatchersModule
import at.bitfire.davdroid.test.BuildConfig
import at.bitfire.synctools.log.LogcatHandler
import dagger.hilt.android.testing.HiltTestApplication
import java.util.logging.Level
import java.util.logging.Logger
@Suppress("unused")
class HiltTestRunner : AndroidJUnitRunner() {
override fun newApplication(cl: ClassLoader, name: String, context: Context): Application =
super.newApplication(cl, HiltTestApplication::class.java.name, context)
override fun onCreate(arguments: Bundle?) {
super.onCreate(arguments)
// set root logger to adb Logcat
val rootLogger = Logger.getLogger("")
rootLogger.level = Level.ALL
rootLogger.handlers.forEach { rootLogger.removeHandler(it) }
rootLogger.addHandler(LogcatHandler(BuildConfig.APPLICATION_ID))
// MockK requirements
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.P)
throw AssertionError("MockK requires Android P [https://mockk.io/ANDROID.html]")
// set main dispatcher for tests (especially runTest)
TestCoroutineDispatchersModule.initMainDispatcher()
}
}

View file

@@ -0,0 +1,58 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid
import android.content.Context
import android.util.Log
import androidx.work.Configuration
import androidx.work.WorkInfo
import androidx.work.WorkManager
import androidx.work.WorkQuery
import androidx.work.WorkerFactory
import androidx.work.testing.WorkManagerTestInitHelper
import org.junit.Assert.assertTrue
import kotlin.math.abs
object TestUtils {
fun assertWithin(expected: Long, actual: Long, tolerance: Long) {
val absDifference = abs(expected - actual)
assertTrue(
"$actual not within ($expected ± $tolerance)",
absDifference <= tolerance
)
}
/**
* Initializes WorkManager for instrumentation tests.
*/
fun setUpWorkManager(context: Context, workerFactory: WorkerFactory? = null) {
val config = Configuration.Builder().setMinimumLoggingLevel(Log.DEBUG)
if (workerFactory != null)
config.setWorkerFactory(workerFactory)
WorkManagerTestInitHelper.initializeTestWorkManager(context, config.build())
}
fun workInStates(context: Context, workerName: String, states: List<WorkInfo.State>): Boolean =
WorkManager.getInstance(context).getWorkInfos(WorkQuery.Builder
.fromUniqueWorkNames(listOf(workerName))
.addStates(states)
.build()
).get().isNotEmpty()
fun workScheduledOrRunning(context: Context, workerName: String): Boolean =
workInStates(context, workerName, listOf(
WorkInfo.State.ENQUEUED,
WorkInfo.State.RUNNING
))
fun workScheduledOrRunningOrSuccessful(context: Context, workerName: String): Boolean =
workInStates(context, workerName, listOf(
WorkInfo.State.ENQUEUED,
WorkInfo.State.RUNNING,
WorkInfo.State.SUCCEEDED
))
}
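
A minimal usage sketch of these helpers in an instrumentation test (the worker class and the unique work name below are hypothetical, chosen only for illustration):

```
import android.content.Context
import androidx.test.platform.app.InstrumentationRegistry
import androidx.work.ExistingWorkPolicy
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Test

// Hypothetical worker, only for this sketch
class SampleWorker(context: Context, params: WorkerParameters) : Worker(context, params) {
    override fun doWork(): Result = Result.success()
}

class SampleWorkerTest {

    private val context: Context = InstrumentationRegistry.getInstrumentation().targetContext

    @Before
    fun setUp() {
        TestUtils.setUpWorkManager(context)     // no custom WorkerFactory needed here
    }

    @Test
    fun testEnqueueUniqueWork() {
        WorkManager.getInstance(context).enqueueUniqueWork(
            "sample-unique-work", ExistingWorkPolicy.KEEP,
            OneTimeWorkRequest.from(SampleWorker::class.java)
        )
        // the test executor may already have run the work, so also accept SUCCEEDED
        assertTrue(TestUtils.workScheduledOrRunningOrSuccessful(context, "sample-unique-work"))
    }
}
```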

View file

@@ -0,0 +1,79 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import android.content.Context
import androidx.room.Room
import androidx.room.migration.AutoMigrationSpec
import androidx.room.migration.Migration
import androidx.room.testing.MigrationTestHelper
import androidx.sqlite.db.framework.FrameworkSQLiteOpenHelperFactory
import androidx.test.platform.app.InstrumentationRegistry
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class AppDatabaseTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var autoMigrations: Set<@JvmSuppressWildcards AutoMigrationSpec>
@Inject
lateinit var logger: Logger
@Inject
lateinit var manualMigrations: Set<@JvmSuppressWildcards Migration>
@Before
fun setup() {
hiltRule.inject()
}
/**
* Creates a database with schema version 8 (the first exported one) and then migrates it to the latest version.
*/
@Test
fun testAllMigrations() {
// Create DB with v8
MigrationTestHelper(
InstrumentationRegistry.getInstrumentation(),
AppDatabase::class.java,
listOf(), // no auto migrations until v8
FrameworkSQLiteOpenHelperFactory()
).createDatabase(TEST_DB, 8).close()
// open and migrate (to current version) database
Room.databaseBuilder(context, AppDatabase::class.java, TEST_DB)
// manual migrations
.addMigrations(*manualMigrations.toTypedArray())
// auto-migrations that need to be specified explicitly
.apply {
for (spec in autoMigrations)
addAutoMigrationSpec(spec)
}
.build()
.openHelper.writableDatabase // this will run all migrations
.close()
}
companion object {
const val TEST_DB = "test"
}
}

View file

@@ -0,0 +1,207 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import android.security.NetworkSecurityPolicy
import androidx.test.filters.SmallTest
import at.bitfire.dav4jvm.DavResource
import at.bitfire.dav4jvm.property.webdav.ResourceType
import at.bitfire.davdroid.network.HttpClient
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class CollectionTest {
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@get:Rule
val hiltRule = HiltAndroidRule(this)
private lateinit var httpClient: HttpClient
private val server = MockWebServer()
@Before
fun setup() {
hiltRule.inject()
httpClient = httpClientBuilder.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
}
@After
fun teardown() {
httpClient.close()
}
@Test
@SmallTest
fun testFromDavResponseAddressBook() {
// r/w address book
server.enqueue(MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav'>" +
"<response>" +
" <href>/</href>" +
" <propstat><prop>" +
" <resourcetype><collection/><CARD:addressbook/></resourcetype>" +
" <displayname>My Contacts</displayname>" +
" <CARD:addressbook-description>My Contacts Description</CARD:addressbook-description>" +
" </prop></propstat>" +
"</response>" +
"</multistatus>"))
lateinit var info: Collection
DavResource(httpClient.okHttpClient, server.url("/"))
.propfind(0, ResourceType.NAME) { response, _ ->
info = Collection.fromDavResponse(response) ?: throw IllegalArgumentException()
}
assertEquals(Collection.TYPE_ADDRESSBOOK, info.type)
assertTrue(info.privWriteContent)
assertTrue(info.privUnbind)
assertNull(info.supportsVEVENT)
assertNull(info.supportsVTODO)
assertNull(info.supportsVJOURNAL)
assertEquals("My Contacts", info.displayName)
assertEquals("My Contacts Description", info.description)
}
@Test
@SmallTest
fun testFromDavResponseCalendar_FullTimezone() {
// read-only calendar, no display name
server.enqueue(MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CAL='urn:ietf:params:xml:ns:caldav' xmlns:ICAL='http://apple.com/ns/ical/'>" +
"<response>" +
" <href>/</href>" +
" <propstat><prop>" +
" <resourcetype><collection/><CAL:calendar/></resourcetype>" +
" <current-user-privilege-set><privilege><read/></privilege></current-user-privilege-set>" +
" <CAL:calendar-description>My Calendar</CAL:calendar-description>" +
" <CAL:calendar-timezone>BEGIN:VCALENDAR\n" +
"PRODID:-//Example Corp.//CalDAV Client//EN\n" +
"VERSION:2.0\n" +
"BEGIN:VTIMEZONE\n" +
"TZID:US-Eastern\n" +
"LAST-MODIFIED:19870101T000000Z\n" +
"BEGIN:STANDARD\n" +
"DTSTART:19671029T020000\n" +
"RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10\n" +
"TZOFFSETFROM:-0400\n" +
"TZOFFSETTO:-0500\n" +
"TZNAME:Eastern Standard Time (US & Canada)\n" +
"END:STANDARD\n" +
"BEGIN:DAYLIGHT\n" +
"DTSTART:19870405T020000\n" +
"RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=4\n" +
"TZOFFSETFROM:-0500\n" +
"TZOFFSETTO:-0400\n" +
"TZNAME:Eastern Daylight Time (US & Canada)\n" +
"END:DAYLIGHT\n" +
"END:VTIMEZONE\n" +
"END:VCALENDAR\n" +
"</CAL:calendar-timezone>" +
" <ICAL:calendar-color>#ff0000</ICAL:calendar-color>" +
" </prop></propstat>" +
"</response>" +
"</multistatus>"))
lateinit var info: Collection
DavResource(httpClient.okHttpClient, server.url("/"))
.propfind(0, ResourceType.NAME) { response, _ ->
info = Collection.fromDavResponse(response)!!
}
assertEquals(Collection.TYPE_CALENDAR, info.type)
assertFalse(info.privWriteContent)
assertFalse(info.privUnbind)
assertNull(info.displayName)
assertEquals("My Calendar", info.description)
assertEquals(0xFFFF0000.toInt(), info.color)
assertEquals("US-Eastern", info.timezoneId)
assertTrue(info.supportsVEVENT!!)
assertTrue(info.supportsVTODO!!)
assertTrue(info.supportsVJOURNAL!!)
}
@Test
@SmallTest
fun testFromDavResponseCalendar_OnlyTzId() {
// read-only calendar, no display name
server.enqueue(MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CAL='urn:ietf:params:xml:ns:caldav' xmlns:ICAL='http://apple.com/ns/ical/'>" +
"<response>" +
" <href>/</href>" +
" <propstat><prop>" +
" <resourcetype><collection/><CAL:calendar/></resourcetype>" +
" <current-user-privilege-set><privilege><read/></privilege></current-user-privilege-set>" +
" <CAL:calendar-description>My Calendar</CAL:calendar-description>" +
" <CAL:calendar-timezone-id>US-Eastern</CAL:calendar-timezone-id>" +
" <ICAL:calendar-color>#ff0000</ICAL:calendar-color>" +
" </prop></propstat>" +
"</response>" +
"</multistatus>"))
lateinit var info: Collection
DavResource(httpClient.okHttpClient, server.url("/"))
.propfind(0, ResourceType.NAME) { response, _ ->
info = Collection.fromDavResponse(response)!!
}
assertEquals(Collection.TYPE_CALENDAR, info.type)
assertFalse(info.privWriteContent)
assertFalse(info.privUnbind)
assertNull(info.displayName)
assertEquals("My Calendar", info.description)
assertEquals(0xFFFF0000.toInt(), info.color)
assertEquals("US-Eastern", info.timezoneId)
assertTrue(info.supportsVEVENT!!)
assertTrue(info.supportsVTODO!!)
assertTrue(info.supportsVJOURNAL!!)
}
@Test
@SmallTest
fun testFromDavResponseWebcal() {
// Webcal subscription
server.enqueue(MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CS='http://calendarserver.org/ns/'>" +
"<response>" +
" <href>/webcal1</href>" +
" <propstat><prop>" +
" <displayname>Sample Subscription</displayname>" +
" <resourcetype><collection/><CS:subscribed/></resourcetype>" +
" <CS:source><href>webcals://example.com/1.ics</href></CS:source>" +
" </prop></propstat>" +
"</response>" +
"</multistatus>"))
lateinit var info: Collection
DavResource(httpClient.okHttpClient, server.url("/"))
.propfind(0, ResourceType.NAME) { response, _ ->
info = Collection.fromDavResponse(response) ?: throw IllegalArgumentException()
}
assertEquals(Collection.TYPE_WEBCAL, info.type)
assertEquals("Sample Subscription", info.displayName)
assertEquals("https://example.com/1.ics".toHttpUrl(), info.source)
}
}

View file

@@ -0,0 +1,97 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class HomeSetDaoTest {
@get:Rule
var hiltRule = HiltAndroidRule(this)
@Inject
lateinit var db: AppDatabase
lateinit var dao: HomeSetDao
var serviceId: Long = 0
@Before
fun setUp() {
hiltRule.inject()
dao = db.homeSetDao()
serviceId = db.serviceDao().insertOrReplace(
Service(id=0, accountName="test", type= Service.TYPE_CALDAV, principal = null)
)
}
@After
fun tearDown() {
db.serviceDao().deleteAll()
}
@Test
fun testInsertOrUpdate() {
// should insert a new row or update (upsert) an existing row - without changing its key!
val entry1 = HomeSet(id=0, serviceId=serviceId, personal=true, url="https://example.com/1".toHttpUrl())
val insertId1 = dao.insertOrUpdateByUrlBlocking(entry1)
assertEquals(1L, insertId1)
assertEquals(entry1.copy(id = 1L), dao.getById(1))
val updatedEntry1 = HomeSet(id=0, serviceId=serviceId, personal=true, url="https://example.com/1".toHttpUrl(), displayName="Updated Entry")
val updateId1 = dao.insertOrUpdateByUrlBlocking(updatedEntry1)
assertEquals(1L, updateId1)
assertEquals(updatedEntry1.copy(id = 1L), dao.getById(1))
val entry2 = HomeSet(id=0, serviceId=serviceId, personal=true, url= "https://example.com/2".toHttpUrl())
val insertId2 = dao.insertOrUpdateByUrlBlocking(entry2)
assertEquals(2L, insertId2)
assertEquals(entry2.copy(id = 2L), dao.getById(2))
}
@Test
fun testInsertOrUpdate_TransactionSafe() {
runBlocking(Dispatchers.IO) {
for (i in 0..9999)
launch {
dao.insertOrUpdateByUrlBlocking(
HomeSet(
id = 0,
serviceId = serviceId,
url = "https://example.com/".toHttpUrl(),
personal = true
)
)
}
}
assertEquals(1, dao.getByService(serviceId).size)
}
@Test
fun testDelete() {
// should delete row with given primary key (id)
val entry1 = HomeSet(id=1, serviceId=serviceId, personal=true, url= "https://example.com/1".toHttpUrl())
val insertId1 = dao.insertOrUpdateByUrlBlocking(entry1)
assertEquals(1L, insertId1)
assertEquals(entry1, dao.getById(1L))
dao.delete(entry1)
assertEquals(null, dao.getById(1L))
}
}
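
For readers unfamiliar with the upsert-by-URL semantics exercised above, here is a hypothetical sketch of what such a DAO method could look like with Room. It is an assumption for illustration only (not DAVx⁵'s actual `HomeSetDao`), and it assumes the `url` column is stored as a string via a type converter:

```
import androidx.room.Dao
import androidx.room.Insert
import androidx.room.Query
import androidx.room.Transaction
import androidx.room.Update

@Dao
interface UpsertByUrlDaoSketch {

    @Query("SELECT * FROM homeset WHERE serviceId=:serviceId AND url=:url")
    fun getByUrl(serviceId: Long, url: String): HomeSet?

    @Insert
    fun insert(homeSet: HomeSet): Long

    @Update
    fun update(homeSet: HomeSet)

    // @Transaction makes the read-then-write atomic, which is what
    // testInsertOrUpdate_TransactionSafe relies on.
    @Transaction
    fun insertOrUpdateByUrl(homeSet: HomeSet): Long {
        val existing = getByUrl(homeSet.serviceId, homeSet.url.toString())
        return if (existing != null) {
            update(homeSet.copy(id = existing.id))  // keep the existing primary key
            existing.id
        } else
            insert(homeSet)
    }
}
```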

View file

@@ -0,0 +1,41 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import android.content.Context
import androidx.room.Room
import androidx.room.migration.AutoMigrationSpec
import dagger.Module
import dagger.Provides
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.components.SingletonComponent
import dagger.hilt.testing.TestInstallIn
import javax.inject.Singleton
@Module
@TestInstallIn(
components = [ SingletonComponent::class ],
replaces = [
AppDatabase.AppDatabaseModule::class
]
)
class MemoryDbModule {
@Provides
@Singleton
fun inMemoryDatabase(
autoMigrations: Set<@JvmSuppressWildcards AutoMigrationSpec>,
@ApplicationContext context: Context
): AppDatabase =
Room.inMemoryDatabaseBuilder(context, AppDatabase::class.java)
// auto-migration specs that need to be specified explicitly
.apply {
for (spec in autoMigrations) {
addAutoMigrationSpec(spec)
}
}
.build()
}

View file

@@ -0,0 +1,96 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import android.database.sqlite.SQLiteConstraintException
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.junit4.MockKRule
import io.mockk.spyk
import io.mockk.verify
import kotlinx.coroutines.test.runTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class PrincipalDaoTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
@Inject
lateinit var db: AppDatabase
private lateinit var principalDao: PrincipalDao
private lateinit var service: Service
private val url = "https://example.com/dav/principal".toHttpUrl()
@Before
fun setUp() {
hiltRule.inject()
principalDao = spyk(db.principalDao())
service = Service(id = 1, accountName = "account", type = "webdav")
db.serviceDao().insertOrReplace(service)
}
@Test
fun insertOrUpdate_insertsIfNotExisting() = runTest {
val principal = Principal(serviceId = service.id, url = url, displayName = "principal")
val id = principalDao.insertOrUpdate(service.id, principal)
assertTrue(id > 0)
val stored = principalDao.get(id)
assertEquals("principal", stored.displayName)
verify(exactly = 0) { principalDao.update(any()) }
}
@Test
fun insertOrUpdate_doesNotUpdateIfDisplayNameIsEqual() = runTest {
val principalOld = Principal(serviceId = service.id, url = url, displayName = "principalOld")
val idOld = principalDao.insertOrUpdate(service.id, principalOld)
val principalNew = Principal(serviceId = service.id, url = url, displayName = "principalOld")
val idNew = principalDao.insertOrUpdate(service.id, principalNew)
assertEquals(idOld, idNew)
val stored = principalDao.get(idOld)
assertEquals("principalOld", stored.displayName)
verify(exactly = 0) { principalDao.update(any()) }
}
@Test
fun insertOrUpdate_updatesIfDisplayNameIsDifferent() = runTest {
val principalOld = Principal(serviceId = service.id, url = url, displayName = "principalOld")
val idOld = principalDao.insertOrUpdate(service.id, principalOld)
val principalNew = Principal(serviceId = service.id, url = url, displayName = "principalNew")
val idNew = principalDao.insertOrUpdate(service.id, principalNew)
assertEquals(idOld, idNew)
val updated = principalDao.get(idOld)
assertEquals("principalNew", updated.displayName)
verify(exactly = 1) { principalDao.update(any()) }
}
@Test(expected = SQLiteConstraintException::class)
fun insertOrUpdate_throwsForeignKeyConstraintViolationException() = runTest {
// throws on non-existing service
val url = "https://example.com/dav/principal".toHttpUrl()
val principal1 = Principal(serviceId = 999, url = url, displayName = "p1")
principalDao.insertOrUpdate(999, principal1)
}
}

View file

@@ -0,0 +1,77 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import androidx.sqlite.SQLiteException
import at.bitfire.davdroid.sync.SyncDataType
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import kotlinx.coroutines.test.runTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.After
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class SyncStatsDaoTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var db: AppDatabase
var collectionId: Long = 0
@Before
fun setUp() {
hiltRule.inject()
val serviceId = db.serviceDao().insertOrReplace(Service(
id = 0,
accountName = "test@example.com",
type = Service.TYPE_CALDAV
))
collectionId = db.collectionDao().insert(Collection(
id = 0,
serviceId = serviceId,
type = Collection.TYPE_CALENDAR,
url = "https://example.com".toHttpUrl()
))
}
@After
fun tearDown() {
db.serviceDao().deleteAll()
}
@Test
fun testInsertOrReplace_ExistingForeignKey() = runTest {
val dao = db.syncStatsDao()
dao.insertOrReplace(
SyncStats(
id = 0,
collectionId = collectionId,
dataType = SyncDataType.CONTACTS.toString(),
lastSync = System.currentTimeMillis()
)
)
}
@Test(expected = SQLiteException::class)
fun testInsertOrReplace_MissingForeignKey() = runTest {
val dao = db.syncStatsDao()
dao.insertOrReplace(
SyncStats(
id = 0,
collectionId = 12345,
dataType = SyncDataType.CONTACTS.toString(),
lastSync = System.currentTimeMillis()
)
)
}
}

View file

@@ -0,0 +1,80 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import kotlinx.coroutines.test.runTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Level
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class WebDavDocumentDaoTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var logger: Logger
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testGetChildren() = runTest {
val mountDao = db.webDavMountDao()
val dao = db.webDavDocumentDao()
val mount = WebDavMount(id = 1, name = "Test", url = "https://example.com/".toHttpUrl())
mountDao.insert(mount)
val root = WebDavDocument(
id = 1,
mountId = mount.id,
parentId = null,
name = "Root Document"
)
dao.insertOrReplace(root)
dao.insertOrReplace(WebDavDocument(id = 0, mountId = mount.id, parentId = root.id, name = "Name 1", displayName = "DisplayName 2"))
dao.insertOrReplace(WebDavDocument(id = 0, mountId = mount.id, parentId = root.id, name = "Name 2", displayName = "DisplayName 1"))
dao.insertOrReplace(WebDavDocument(id = 0, mountId = mount.id, parentId = root.id, name = "Name 3", displayName = "Directory 1", isDirectory = true))
try {
dao.getChildren(root.id, orderBy = "name DESC").let { result ->
logger.log(Level.INFO, "getChildren single sort Result", result)
assertEquals(listOf(
"Name 3",
"Name 2",
"Name 1"
), result.map { it.name })
}
dao.getChildren(root.id, orderBy = "isDirectory DESC, name ASC").let { result ->
logger.log(Level.INFO, "getChildren multiple sort Result", result)
assertEquals(listOf(
"Name 3",
"Name 1",
"Name 2"
), result.map { it.name })
}
} finally {
mountDao.deleteAsync(mount)
}
}
}

View file

@@ -0,0 +1,83 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db.migration
import at.bitfire.davdroid.db.Collection.Companion.TYPE_CALENDAR
import at.bitfire.davdroid.db.Service
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNull
import org.junit.Test
@HiltAndroidTest
class AutoMigration16Test: DatabaseMigrationTest(toVersion = 16) {
@Test
fun testMigrate_WithTimeZone() = testMigration(
prepare = { db ->
val minimalVTimezone = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:DAVx5
BEGIN:VTIMEZONE
TZID:America/New_York
END:VTIMEZONE
END:VCALENDAR
""".trimIndent()
db.execSQL(
"INSERT INTO service (id, accountName, type) VALUES (?, ?, ?)",
arrayOf(1, "test", Service.Companion.TYPE_CALDAV)
)
db.execSQL(
"INSERT INTO collection (id, serviceId, type, url, privWriteContent, privUnbind, forceReadOnly, sync, timezone) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
arrayOf(1, 1, TYPE_CALENDAR, "https://example.com", true, true, false, false, minimalVTimezone)
)
}
) { db ->
db.query("SELECT timezoneId FROM collection WHERE id=1").use { cursor ->
cursor.moveToFirst()
assertEquals("America/New_York", cursor.getString(0))
}
}
@Test
fun testMigrate_WithTimeZone_Unparseable() = testMigration(
prepare = { db ->
db.execSQL(
"INSERT INTO service (id, accountName, type) VALUES (?, ?, ?)",
arrayOf(1, "test", Service.Companion.TYPE_CALDAV)
)
db.execSQL(
"INSERT INTO collection (id, serviceId, type, url, privWriteContent, privUnbind, forceReadOnly, sync, timezone) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
arrayOf(1, 1, TYPE_CALENDAR, "https://example.com", true, true, false, false, "Some Garbage Content")
)
}
) { db ->
db.query("SELECT timezoneId FROM collection WHERE id=1").use { cursor ->
cursor.moveToFirst()
assertNull(cursor.getString(0))
}
}
@Test
fun testMigrate_WithoutTimezone() = testMigration(
prepare = { db ->
db.execSQL(
"INSERT INTO service (id, accountName, type) VALUES (?, ?, ?)",
arrayOf(1, "test", Service.Companion.TYPE_CALDAV)
)
db.execSQL(
"INSERT INTO collection (id, serviceId, type, url, privWriteContent, privUnbind, forceReadOnly, sync) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
arrayOf(1, 1, TYPE_CALENDAR, "https://example.com", true, true, false, false)
)
}
) { db ->
db.query("SELECT timezoneId FROM collection WHERE id=1").use { cursor ->
cursor.moveToFirst()
assertNull(cursor.getString(0))
}
}
}
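
The migration tested here reduces a full `calendar-timezone` VTIMEZONE blob to a plain time-zone ID and drops unparseable content. As a rough illustration of that reduction only (an assumption, not the actual migration code, which presumably uses a proper iCalendar parser), extracting the TZID could look like this:

```
// Illustrative only: extract the TZID from a VCALENDAR/VTIMEZONE text, or null if there is none
fun timezoneIdOrNull(vTimeZone: String): String? =
    Regex("^TZID:(.+)$", RegexOption.MULTILINE)
        .find(vTimeZone)
        ?.groupValues?.get(1)
        ?.trim()
```

Applied to the test inputs above, this yields "America/New_York" for the minimal VTIMEZONE and null for "Some Garbage Content".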

View file

@@ -0,0 +1,79 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db.migration
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert.assertEquals
import org.junit.Test
@HiltAndroidTest
class AutoMigration18Test : DatabaseMigrationTest(toVersion = 18) {
@Test
fun testMigration_AllAuthorities() = testMigration(
prepare = { db ->
// Insert service and collection to respect relation constraints
db.execSQL("INSERT INTO service (id, accountName, type) VALUES (?, ?, ?)", arrayOf<Any?>(1, "test", 1))
listOf(1L, 2L, 3L).forEach { id ->
db.execSQL(
"INSERT INTO collection (id, serviceId, url, type, privWriteContent, privUnbind, forceReadOnly, sync) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
arrayOf<Any?>(id, 1, "https://example.com/$id", 1, 1, 1, 0, 1)
)
}
// Insert some syncstats with authorities and lastSync times
val syncstats = listOf(
Entry(1, 1, "com.android.contacts", 1000),
Entry(2, 1, "com.android.calendar", 1000),
Entry(3, 1, "org.dmfs.tasks", 1000),
Entry(4, 1, "org.tasks.opentasks", 2000),
Entry(5, 1, "at.techbee.jtx.provider", 3000), // highest lastSync for collection 1
Entry(6, 1, "unknown.authority", 1000), // ignored
Entry(7, 2, "org.dmfs.tasks", 1000),
Entry(8, 2, "org.tasks.opentasks", 2000), // highest lastSync for collection 2
Entry(9, 3, "org.tasks.opentasks", 1000),
)
syncstats.forEach { (id, collectionId, authority, lastSync) ->
db.execSQL(
"INSERT INTO syncstats (id, collectionId, authority, lastSync) VALUES (?, ?, ?, ?)",
arrayOf<Any?>(id, collectionId, authority, lastSync)
)
}
},
validate = { db ->
db.query("SELECT id, collectionId, dataType FROM syncstats ORDER BY id").use { cursor ->
val found = mutableListOf<Entry>()
db.query("SELECT id, collectionId, dataType FROM syncstats ORDER BY id").use { cursor ->
val idIdx = cursor.getColumnIndex("id")
val colIdx = cursor.getColumnIndex("collectionId")
val typeIdx = cursor.getColumnIndex("dataType")
while (cursor.moveToNext())
found.add(
Entry(cursor.getInt(idIdx), cursor.getLong(colIdx), cursor.getString(typeIdx))
)
}
// Expect one TASKS row per collection (collections 1, 2, 3)
assertEquals(
listOf(
Entry(1, 1, "CONTACTS"),
Entry(2, 1, "EVENTS"),
Entry(5, 1, "TASKS"), // highest lastSync TASK for collection 1 is JTX Board
Entry(8, 2, "TASKS"), // highest lastSync TASK for collection 2
Entry(9, 3, "TASKS"), // only TASK for collection 3
), found
)
}
}
)
data class Entry(
val id: Int,
val collectionId: Long,
val dataType: String? = null,
val lastSync: Long? = null
)
}
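
The expected rows above imply a fixed mapping from sync authority to data type during the migration: unknown authorities are dropped, and for tasks only the row with the highest `lastSync` per collection survives. The mapping itself, as a sketch derived from the test data (not copied from the migration source):

```
// Authority → dataType mapping implied by the test data above
fun dataTypeForAuthority(authority: String): String? = when (authority) {
    "com.android.contacts" -> "CONTACTS"
    "com.android.calendar" -> "EVENTS"
    "org.dmfs.tasks", "org.tasks.opentasks", "at.techbee.jtx.provider" -> "TASKS"
    else -> null    // unknown authorities are ignored by the migration
}
```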

View file

@@ -0,0 +1,86 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.db.migration
import androidx.room.migration.AutoMigrationSpec
import androidx.room.migration.Migration
import androidx.room.testing.MigrationTestHelper
import androidx.sqlite.db.SupportSQLiteDatabase
import androidx.sqlite.db.framework.FrameworkSQLiteOpenHelperFactory
import androidx.test.platform.app.InstrumentationRegistry
import at.bitfire.davdroid.db.AppDatabase
import dagger.hilt.android.testing.HiltAndroidRule
import org.junit.Before
import org.junit.Rule
import javax.inject.Inject
/**
* Helper for testing the database migration from [toVersion] - 1 to [toVersion].
*
* @param toVersion The target version to migrate to.
*/
abstract class DatabaseMigrationTest(
private val toVersion: Int
) {
@Inject
lateinit var autoMigrations: Set<@JvmSuppressWildcards AutoMigrationSpec>
@Inject
lateinit var manualMigrations: Set<@JvmSuppressWildcards Migration>
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Before
fun setup() {
hiltRule.inject()
}
/**
* Used for testing the migration process from [toVersion]-1 to [toVersion].
*
 * Note: SQLite's foreign key constraint enforcement is not enabled in tests. We would need
 * to enable it ourselves by executing "PRAGMA foreign_keys=ON" directly after opening
 * a new database connection (it applies per connection). In tests it's usually more practical
 * not to do so, however. For production database connections, Room enables it for us.
*
* @param prepare Callback to prepare the database. Will be run with database schema in version [toVersion] - 1.
* @param validate Callback to validate the migration result. Will be run with database schema in version [toVersion].
*/
protected fun testMigration(
prepare: (SupportSQLiteDatabase) -> Unit,
validate: (SupportSQLiteDatabase) -> Unit
) {
val helper = MigrationTestHelper(
InstrumentationRegistry.getInstrumentation(),
AppDatabase::class.java,
autoMigrations.toList(),
FrameworkSQLiteOpenHelperFactory()
)
// Prepare the database with the initial version.
val dbName = "test"
helper.createDatabase(dbName, version = toVersion - 1).apply {
// We could enable foreign key constraint enforcement here
// by setting "PRAGMA foreign_keys=ON".
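// For example (commented out on purpose; enable only if a test really needs FK enforcement):
// execSQL("PRAGMA foreign_keys=ON")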
prepare(this)
close()
}
// Re-open the database with the new version and provide all the migrations.
val db = helper.runMigrationsAndValidate(
name = dbName,
version = toVersion,
validateDroppedTables = true,
migrations = manualMigrations.toTypedArray()
)
validate(db)
}
}

View file

@@ -0,0 +1,20 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.di
import at.bitfire.davdroid.sync.FakeSyncAdapter
import at.bitfire.davdroid.sync.adapter.SyncAdapter
import at.bitfire.davdroid.sync.adapter.SyncAdapterImpl
import dagger.Binds
import dagger.Module
import dagger.hilt.components.SingletonComponent
import dagger.hilt.testing.TestInstallIn
@Module
@TestInstallIn(components = [SingletonComponent::class], replaces = [SyncAdapterImpl.RealSyncAdapterModule::class])
abstract class FakeSyncAdapterModule {
@Binds
abstract fun provide(impl: FakeSyncAdapter): SyncAdapter
}

View file

@@ -0,0 +1,59 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.di
import at.bitfire.davdroid.di.TestCoroutineDispatchersModule.standardTestDispatcher
import dagger.Module
import dagger.Provides
import dagger.hilt.components.SingletonComponent
import dagger.hilt.testing.TestInstallIn
import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.test.StandardTestDispatcher
import kotlinx.coroutines.test.setMain
/**
* Provides test dispatchers to be injected instead of the normal ones.
*
* The [standardTestDispatcher] is set as main dispatcher in [at.bitfire.davdroid.HiltTestRunner],
 * so that tests can simply use [kotlinx.coroutines.test.runTest] without passing [standardTestDispatcher] explicitly.
*/
@Module
@TestInstallIn(
components = [SingletonComponent::class],
replaces = [CoroutineDispatchersModule::class]
)
object TestCoroutineDispatchersModule {
private val standardTestDispatcher = StandardTestDispatcher()
@Provides
@DefaultDispatcher
fun defaultDispatcher(): CoroutineDispatcher = standardTestDispatcher
@Provides
@IoDispatcher
fun ioDispatcher(): CoroutineDispatcher = standardTestDispatcher
@Provides
@MainDispatcher
fun mainDispatcher(): CoroutineDispatcher = standardTestDispatcher
@Provides
@SyncDispatcher
fun syncDispatcher(): CoroutineDispatcher = standardTestDispatcher
/**
* Sets the [standardTestDispatcher] as [Dispatchers.Main] so that test dispatchers
* created in the future use the same scheduler. See [StandardTestDispatcher] docs
* for more information.
*/
@OptIn(ExperimentalCoroutinesApi::class)
fun initMainDispatcher() {
Dispatchers.setMain(standardTestDispatcher)
}
}
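
A minimal sketch of what this setup buys in a test (the test class below is hypothetical and assumes the runner has already called `initMainDispatcher()`):

```
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.test.runTest
import org.junit.Assert.assertTrue
import org.junit.Test

class MainDispatcherSharingTest {

    @Test
    fun mainDispatcherSharesTestScheduler() = runTest {
        var done = false
        // Dispatchers.Main was replaced by the StandardTestDispatcher above, so this
        // coroutine runs on the same test scheduler and completes before runTest returns.
        launch(Dispatchers.Main) { done = true }.join()
        assertTrue(done)
    }
}
```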

View file

@@ -0,0 +1,24 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.di
import at.bitfire.davdroid.startup.StartupPlugin
import at.bitfire.davdroid.startup.TasksAppWatcher
import dagger.Module
import dagger.hilt.components.SingletonComponent
import dagger.hilt.testing.TestInstallIn
import dagger.multibindings.Multibinds
// remove TasksAppWatcherModule from Android tests
@Module
@TestInstallIn(
components = [SingletonComponent::class],
replaces = [TasksAppWatcher.TasksAppWatcherModule::class]
)
abstract class TestTasksAppWatcherModule {
// provides empty set of plugins
@Multibinds
abstract fun empty(): Set<StartupPlugin>
}

View file

@@ -0,0 +1,35 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.network
import android.os.Build
import androidx.test.filters.SdkSuppress
import org.junit.Assert.assertEquals
import org.junit.Test
import org.xbill.DNS.ARecord
import org.xbill.DNS.Lookup
import org.xbill.DNS.Type
import java.net.Inet4Address
import java.net.InetAddress
class Android10ResolverTest {
val FQDN_DAVX5 = "www.davx5.com"
@Test
@SdkSuppress(minSdkVersion = Build.VERSION_CODES.Q, maxSdkVersion = 34)
fun testResolveA() {
val www = InetAddress.getAllByName(FQDN_DAVX5).filterIsInstance<Inet4Address>().first()
val srvLookup = Lookup(FQDN_DAVX5, Type.A)
srvLookup.setResolver(Android10Resolver())
val resultGeneric = srvLookup.run()
assertEquals(1, resultGeneric.size)
val result = resultGeneric.first() as ARecord
assertEquals(www, result.address)
}
}

View file

@@ -0,0 +1,122 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.network
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.mockk
import org.junit.Assert.assertArrayEquals
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.xbill.DNS.DClass
import org.xbill.DNS.Name
import org.xbill.DNS.SRVRecord
import org.xbill.DNS.TXTRecord
import javax.inject.Inject
import kotlin.random.Random
@HiltAndroidTest
class DnsRecordResolverTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var dnsRecordResolver: DnsRecordResolver
@Before
fun setup() {
hiltRule.inject()
}
@Test
fun testBestSRVRecord_Empty() {
assertNull(dnsRecordResolver.bestSRVRecord(emptyArray()))
}
@Test
fun testBestSRVRecord_MultipleRecords_Priority_Different() {
val dns1010 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 10, 10, 8443, Name.fromString("dav1010.example.com.")
)
val dns2010 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 20, 20, 8443, Name.fromString("dav2010.example.com.")
)
// lowest priority first
val result = dnsRecordResolver.bestSRVRecord(arrayOf(dns1010, dns2010))
assertEquals(dns1010, result)
}
@Test
fun testBestSRVRecord_MultipleRecords_Priority_Same() {
val dns1010 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 10, 10, 8443, Name.fromString("dav1010.example.com.")
)
val dns1020 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 10, 20, 8443, Name.fromString("dav1020.example.com.")
)
val dns1030 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 10, 30, 8443, Name.fromString("dav1030.example.com.")
)
val records = arrayOf(dns1010, dns1020, dns1030)
val randomNumberGenerator = mockk<Random>()
for (i in 0..60) {
every { randomNumberGenerator.nextInt(0, 61) } returns i
val expected = when (i) {
in 0..10 -> dns1010
in 11..30 -> dns1020
else -> dns1030
}
assertEquals(expected, dnsRecordResolver.bestSRVRecord(records, randomNumberGenerator))
}
}
@Test
fun testBestSRVRecord_OneRecord() {
val dns1010 = SRVRecord(
Name.fromString("_caldavs._tcp.example.com."),
DClass.IN, 3600, 10, 10, 8443, Name.fromString("dav1010.example.com.")
)
val result = dnsRecordResolver.bestSRVRecord(arrayOf(dns1010))
assertEquals(dns1010, result)
}
@Test
fun testPathsFromTXTRecords_Empty() {
assertTrue(dnsRecordResolver.pathsFromTXTRecords(arrayOf()).isEmpty())
}
@Test
fun testPathsFromTXTRecords_OnePath() {
val result = dnsRecordResolver.pathsFromTXTRecords(arrayOf(
TXTRecord(Name.fromString("example.com."), 0, 0L, listOf("something=else", "path=/path1"))
)).toTypedArray()
assertArrayEquals(arrayOf("/path1"), result)
}
@Test
fun testPathsFromTXTRecords_TwoPaths() {
val result = dnsRecordResolver.pathsFromTXTRecords(arrayOf(
TXTRecord(Name.fromString("example.com."), 0, 0L, listOf("path=/path1", "something-else", "path=/path2"))
)).toTypedArray()
result.sort()
assertArrayEquals(arrayOf("/path1", "/path2"), result)
}
}
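
For context, the weighted selection exercised in `testBestSRVRecord_MultipleRecords_Priority_Same` follows the usual SRV logic: take the records with the lowest priority value, then pick among them at random, weighted by their weights (compare RFC 2782). A sketch of that selection, illustrative only and not necessarily the actual `DnsRecordResolver` code:

```
import org.xbill.DNS.SRVRecord
import kotlin.random.Random

// Sketch: choose an SRV record by lowest priority, then weight-proportional random choice
fun pickSrvRecord(records: Array<SRVRecord>, random: Random = Random.Default): SRVRecord? {
    val lowestPriority = records.minOfOrNull { it.priority } ?: return null
    val candidates = records.filter { it.priority == lowestPriority }
    val totalWeight = candidates.sumOf { it.weight }
    // random value in 0..totalWeight, matching nextInt(0, 61) for weights 10+20+30 in the test above
    val selector = random.nextInt(0, totalWeight + 1)
    var cumulative = 0
    for (record in candidates) {
        cumulative += record.weight
        if (selector <= cumulative)
            return record
    }
    return candidates.last()
}
```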

View file

@@ -0,0 +1,88 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.network
import android.security.NetworkSecurityPolicy
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.Request
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class HttpClientTest {
@get:Rule
var hiltRule = HiltAndroidRule(this)
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
lateinit var httpClient: HttpClient
lateinit var server: MockWebServer
@Before
fun setUp() {
hiltRule.inject()
httpClient = httpClientBuilder.build()
server = MockWebServer()
server.start(30000)
}
@After
fun tearDown() {
server.shutdown()
httpClient.close()
}
@Test
fun testCookies() {
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
val url = server.url("/test")
// set cookie for root path (/) and /test path in first response
server.enqueue(MockResponse()
.setResponseCode(200)
.addHeader("Set-Cookie", "cookie1=1; path=/")
.addHeader("Set-Cookie", "cookie2=2")
.setBody("Cookie set"))
httpClient.okHttpClient.newCall(Request.Builder()
.get().url(url)
.build()).execute()
assertNull(server.takeRequest().getHeader("Cookie"))
// cookie should be sent with second request
// second response lets first cookie expire and overwrites second cookie
server.enqueue(MockResponse()
.addHeader("Set-Cookie", "cookie1=1a; path=/; Max-Age=0")
.addHeader("Set-Cookie", "cookie2=2a")
.setResponseCode(200))
httpClient.okHttpClient.newCall(Request.Builder()
.get().url(url)
.build()).execute()
val header = server.takeRequest().getHeader("Cookie")
assertTrue(header == "cookie1=1; cookie2=2" || header == "cookie2=2; cookie1=1")
server.enqueue(MockResponse()
.setResponseCode(200))
httpClient.okHttpClient.newCall(Request.Builder()
.get().url(url)
.build()).execute()
assertEquals("cookie2=2a", server.takeRequest().getHeader("Cookie"))
}
}

View file

@@ -0,0 +1,46 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.network
import androidx.test.filters.SdkSuppress
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.Request
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class OkhttpClientTest {
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Before
fun inject() {
hiltRule.inject()
}
@Test
@SdkSuppress(maxSdkVersion = 34)
fun testIcloudWithSettings() {
httpClientBuilder.build().use { client ->
client.okHttpClient
.newCall(
Request.Builder()
.get()
.url("https://icloud.com")
.build()
)
.execute()
}
}
}

View file

@@ -0,0 +1,46 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.push
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class PushMessageHandlerTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var handler: PushMessageHandler
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testParse_InvalidXml() {
Assert.assertNull(handler.parse("Non-XML content"))
}
@Test
fun testParse_WithXmlDeclAndTopic() {
val topic = handler.parse(
"<?xml version=\"1.0\" encoding=\"utf-8\" ?>" +
"<P:push-message xmlns:D=\"DAV:\" xmlns:P=\"https://bitfire.at/webdav-push\">" +
" <P:topic>O7M1nQ7cKkKTKsoS_j6Z3w</P:topic>" +
"</P:push-message>"
)
Assert.assertEquals("O7M1nQ7cKkKTKsoS_j6Z3w", topic)
}
}
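
As a rough idea of what parsing such a push message involves (a hedged sketch using the platform `XmlPullParser`, not the actual `PushMessageHandler` implementation), extracting the topic could look like this:

```
import org.xmlpull.v1.XmlPullParser
import org.xmlpull.v1.XmlPullParserFactory

private const val NS_WEBDAV_PUSH = "https://bitfire.at/webdav-push"

// Sketch: returns the <P:topic> text of a WebDAV-Push message, or null for non-XML content
fun parseTopicSketch(message: String): String? = try {
    val parser = XmlPullParserFactory.newInstance()
        .apply { isNamespaceAware = true }
        .newPullParser()
    parser.setInput(message.reader())
    var topic: String? = null
    var event = parser.eventType
    while (event != XmlPullParser.END_DOCUMENT) {
        if (event == XmlPullParser.START_TAG &&
            parser.namespace == NS_WEBDAV_PUSH &&
            parser.name == "topic")
            topic = parser.nextText()
        event = parser.next()
    }
    topic
} catch (_: Exception) {
    null    // invalid XML → no topic, as asserted in testParse_InvalidXml
}
```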

View file

@@ -0,0 +1,102 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.push
import android.content.Context
import android.content.Intent
import android.os.IBinder
import androidx.test.rule.ServiceTestRule
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.coVerify
import io.mockk.confirmVerified
import io.mockk.every
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.test.advanceUntilIdle
import kotlinx.coroutines.test.runTest
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.unifiedpush.android.connector.FailedReason
import org.unifiedpush.android.connector.PushService
import org.unifiedpush.android.connector.data.PushEndpoint
import javax.inject.Inject
@OptIn(ExperimentalCoroutinesApi::class)
@HiltAndroidTest
class UnifiedPushServiceTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
@get:Rule
val serviceTestRule = ServiceTestRule()
@Inject
@ApplicationContext
lateinit var context: Context
@RelaxedMockK
@BindValue
lateinit var pushRegistrationManager: PushRegistrationManager
lateinit var binder: IBinder
lateinit var unifiedPushService: UnifiedPushService
@Before
fun setUp() {
hiltRule.inject()
binder = serviceTestRule.bindService(Intent(context, UnifiedPushService::class.java))!!
unifiedPushService = (binder as PushService.PushBinder).getService() as UnifiedPushService
}
@Test
fun testOnNewEndpoint() = runTest {
val endpoint = mockk<PushEndpoint> {
every { url } returns "https://example.com/12"
}
unifiedPushService.onNewEndpoint(endpoint, "12")
advanceUntilIdle()
coVerify {
pushRegistrationManager.processSubscription(12, endpoint)
}
confirmVerified(pushRegistrationManager)
}
@Test
fun testOnRegistrationFailed() = runTest {
unifiedPushService.onRegistrationFailed(FailedReason.INTERNAL_ERROR, "34")
advanceUntilIdle()
coVerify {
pushRegistrationManager.removeSubscription(34)
}
confirmVerified(pushRegistrationManager)
}
@Test
fun testOnUnregistered() = runTest {
unifiedPushService.onUnregistered("45")
advanceUntilIdle()
coVerify {
pushRegistrationManager.removeSubscription(45)
}
confirmVerified(pushRegistrationManager)
}
}

View file

@@ -0,0 +1,213 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.repository
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import androidx.hilt.work.HiltWorkerFactory
import at.bitfire.davdroid.R
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.resource.LocalAddressBookStore
import at.bitfire.davdroid.resource.LocalCalendarStore
import at.bitfire.davdroid.resource.LocalDataStore
import at.bitfire.davdroid.settings.AccountSettings
import at.bitfire.davdroid.sync.AutomaticSyncManager
import at.bitfire.davdroid.sync.SyncDataType
import at.bitfire.davdroid.sync.TasksAppManager
import at.bitfire.davdroid.sync.account.AccountsCleanupWorker
import at.bitfire.davdroid.sync.account.TestAccount
import at.bitfire.davdroid.sync.worker.SyncWorkerManager
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.clearAllMocks
import io.mockk.coVerify
import io.mockk.every
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import io.mockk.mockkObject
import io.mockk.unmockkObject
import io.mockk.verify
import junit.framework.TestCase.assertTrue
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountRepositoryTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
// System under test
@Inject
lateinit var accountRepository: AccountRepository
// Real injections
@Inject
@ApplicationContext
lateinit var context: Context
@Inject
lateinit var accountSettingsFactory: AccountSettings.Factory
@Inject
lateinit var workerFactory: HiltWorkerFactory
// Dependency overrides
@BindValue @MockK(relaxed = true)
lateinit var automaticSyncManager: AutomaticSyncManager
@BindValue @MockK(relaxed = true)
lateinit var localAddressBookStore: LocalAddressBookStore
@BindValue @MockK(relaxed = true)
lateinit var localCalendarStore: LocalCalendarStore
@BindValue @MockK(relaxed = true)
lateinit var serviceRepository: DavServiceRepository
@BindValue @MockK(relaxed = true)
lateinit var syncWorkerManager: SyncWorkerManager
@BindValue @MockK(relaxed = true)
lateinit var tasksAppManager: TasksAppManager
// Account setup
private val newName = "Renamed Account"
lateinit var am: AccountManager
lateinit var accountType: String
lateinit var account: Account
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context, workerFactory)
// Account setup
am = AccountManager.get(context)
accountType = context.getString(R.string.account_type)
account = TestAccount.create()
// AccountsCleanupWorker static mocking
mockkObject(AccountsCleanupWorker)
every { AccountsCleanupWorker.lockAccountsCleanup() } returns Unit
}
@After
fun tearDown() {
am.getAccountsByType(accountType).forEach { account ->
am.removeAccountExplicitly(account)
}
unmockkObject(AccountsCleanupWorker)
clearAllMocks()
}
// testRename
@Test(expected = IllegalArgumentException::class)
fun testRename_checksForAlreadyExisting() = runTest {
val existing = Account("Existing Account", accountType)
am.addAccountExplicitly(existing, null, null)
accountRepository.rename(account.name, existing.name)
}
@Test
fun testRename_locksAccountsCleanup() = runTest {
accountRepository.rename(account.name, newName)
verify { AccountsCleanupWorker.lockAccountsCleanup() }
}
@Test
fun testRename_renamesAccountInAndroid() = runTest {
accountRepository.rename(account.name, newName)
val accountsAfter = am.getAccountsByType(accountType)
assertTrue(accountsAfter.any { it.name == newName })
}
@Test
fun testRename_cancelsRunningSynchronizationOfOldAccount() = runTest {
accountRepository.rename(account.name, newName)
coVerify { syncWorkerManager.cancelAllWork(account) }
}
@Test
fun testRename_disablesPeriodicSyncsForOldAccount() = runTest {
accountRepository.rename(account.name, newName)
for (dataType in SyncDataType.entries)
coVerify(exactly = 1) {
syncWorkerManager.disablePeriodic(account, dataType)
}
}
@Test
fun testRename_updatesAccountNameReferencesInDatabase() = runTest {
accountRepository.rename(account.name, newName)
coVerify { serviceRepository.renameAccount(account.name, newName) }
}
@Test
fun testRename_updatesAddressBooks() = runTest {
accountRepository.rename(account.name, newName)
val newAccount = accountRepository.fromName(newName)
coVerify { localAddressBookStore.updateAccount(account, newAccount) }
}
@Test
fun testRename_updatesCalendarEvents() = runTest {
accountRepository.rename(account.name, newName)
val newAccount = accountRepository.fromName(newName)
coVerify { localCalendarStore.updateAccount(account, newAccount) }
}
@Test
fun testRename_updatesAccountNameOfLocalTasks() = runTest {
val mockDataStore = mockk<LocalDataStore<*>>(relaxed = true)
every { tasksAppManager.getDataStore() } returns mockDataStore
accountRepository.rename(account.name, newName)
coVerify { mockDataStore.updateAccount(account, accountRepository.fromName(newName)) }
}
@Test
fun testRename_updatesAutomaticSync() = runTest {
accountRepository.rename(account.name, newName)
val newAccount = accountRepository.fromName(newName)
coVerify { automaticSyncManager.updateAutomaticSync(newAccount) }
}
@Test
fun testRename_releasesAccountsCleanupWorkerMutex() = runTest {
accountRepository.rename(account.name, newName)
verify { AccountsCleanupWorker.lockAccountsCleanup() }
coVerify { serviceRepository.renameAccount(account.name, newName) }
}
}

View file

@@ -0,0 +1,225 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.accounts.Account
import android.accounts.AccountManager
import android.content.ContentProviderClient
import android.content.Context
import at.bitfire.davdroid.R
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import io.mockk.mockkObject
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class LocalAddressBookStoreTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var localAddressBookStore: LocalAddressBookStore
@RelaxedMockK
lateinit var provider: ContentProviderClient
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
lateinit var addressBookAccountType: String
lateinit var addressBookAccount: Account
lateinit var account: Account
lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
addressBookAccountType = context.getString(R.string.account_type_address_book)
account = TestAccount.create()
service = Service(
id = 200,
accountName = account.name,
type = Service.Companion.TYPE_CARDDAV,
principal = null
)
db.serviceDao().insertOrReplace(service)
addressBookAccount = Account(
"MrRobert@example.com",
addressBookAccountType
)
}
@After
fun tearDown() {
TestAccount.remove(account)
removeAddressBooks()
}
@Test
fun test_accountName_removesSpecialChars() {
// Should remove ISO control characters and the characters `, " and '
val collection = mockk<Collection> {
every { id } returns 1
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { displayName } returns "手 M's_\"F-e\"\\(´д`)/;æøå% äöü #42"
every { serviceId } returns service.id
}
assertEquals("手 Ms_F-e\\(´д)/;æøå% äöü #42 (Test Account) #1", localAddressBookStore.accountName(collection))
}
@Test
fun test_accountName_missingService() {
val collection = mockk<Collection> {
every { id } returns 42
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { displayName } returns null
every { serviceId } returns 404 // missing service
}
assertEquals("funnyfriends #42", localAddressBookStore.accountName(collection))
}
@Test
fun test_accountName_missingDisplayName() {
val collection = mockk<Collection> {
every { id } returns 42
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { displayName } returns null
every { serviceId } returns service.id
}
val accountName = localAddressBookStore.accountName(collection)
assertEquals("funnyfriends (${account.name}) #42", accountName)
}
@Test
fun test_accountName_missingDisplayNameAndService() {
val collection = mockk<Collection> {
every { id } returns 1
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { displayName } returns null
every { serviceId } returns 404 // missing service
}
assertEquals("funnyfriends #1", localAddressBookStore.accountName(collection))
}
@Test
fun test_create_createAccountReturnsNull() {
val collection = mockk<Collection>(relaxed = true) {
every { serviceId } returns service.id
every { id } returns 1
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
}
mockkObject(localAddressBookStore)
every { localAddressBookStore.createAddressBookAccount(any(), any(), any()) } returns null
assertEquals(null, localAddressBookStore.create(provider, collection))
}
@Test
fun test_create_ReadOnly() {
val collection = mockk<Collection>(relaxed = true) {
every { serviceId } returns service.id
every { id } returns 1
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { readOnly() } returns true
}
val addrBook = localAddressBookStore.create(provider, collection)!!
assertEquals(Account("funnyfriends (Test Account) #1", addressBookAccountType), addrBook.addressBookAccount)
assertTrue(addrBook.readOnly)
}
@Test
fun test_create_ReadWrite() {
val collection = mockk<Collection>(relaxed = true) {
every { serviceId } returns service.id
every { id } returns 1
every { url } returns "https://example.com/addressbook/funnyfriends".toHttpUrl()
every { readOnly() } returns false
}
val addrBook = localAddressBookStore.create(provider, collection)!!
assertEquals(Account("funnyfriends (Test Account) #1", addressBookAccountType), addrBook.addressBookAccount)
assertFalse(addrBook.readOnly)
}
@Test
fun test_getAll_differentAccount() {
val accountManager = AccountManager.get(context)
mockkObject(accountManager)
every { accountManager.getAccountsByType(any()) } returns arrayOf(addressBookAccount)
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_NAME) } returns "Another Unrelated Account"
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_TYPE) } returns account.type
val result = localAddressBookStore.getAll(account, provider)
assertTrue(result.isEmpty())
}
@Test
fun test_getAll_sameAccount() {
val accountManager = AccountManager.get(context)
mockkObject(accountManager)
every { accountManager.getAccountsByType(any()) } returns arrayOf(addressBookAccount)
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_NAME) } returns account.name
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_TYPE) } returns account.type
val result = localAddressBookStore.getAll(account, provider)
assertEquals(1, result.size)
assertEquals(addressBookAccount, result.first().addressBookAccount)
}
/**
     * Tests that the read-only state is calculated correctly.
*/
@Test
fun test_shouldBeReadOnly() {
val collectionReadOnly = mockk<Collection> { every { readOnly() } returns true }
assertTrue(LocalAddressBookStore.shouldBeReadOnly(collectionReadOnly, false))
assertTrue(LocalAddressBookStore.shouldBeReadOnly(collectionReadOnly, true))
val collectionNotReadOnly = mockk<Collection> { every { readOnly() } returns false }
assertFalse(LocalAddressBookStore.shouldBeReadOnly(collectionNotReadOnly, false))
assertTrue(LocalAddressBookStore.shouldBeReadOnly(collectionNotReadOnly, true))
}
// helpers
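    /**
     * Removes all address book accounts of the address book account type from the account manager.
     */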
private fun removeAddressBooks() {
val accountManager = AccountManager.get(context)
accountManager.getAccountsByType(addressBookAccountType).forEach {
accountManager.removeAccountExplicitly(it)
}
}
}

View file

@@ -0,0 +1,177 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentUris
import android.content.Context
import android.provider.ContactsContract
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.vcard4android.Contact
import at.bitfire.vcard4android.LabeledProperty
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import ezvcard.property.Telephone
import org.junit.AfterClass
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.BeforeClass
import org.junit.ClassRule
import org.junit.Rule
import org.junit.Test
import java.io.FileNotFoundException
import java.util.LinkedList
import java.util.Optional
import javax.inject.Inject
@HiltAndroidTest
class LocalAddressBookTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
@get:Rule
val hiltRule = HiltAndroidRule(this)
val account = Account("Test Account", "Test Account Type")
@Before
fun setUp() {
hiltRule.inject()
}
/**
* Tests whether contacts are moved (and not lost) when an address book is renamed.
*/
@Test
fun test_renameAccount_retainsContacts() {
localTestAddressBookProvider.provide(account, provider) { addressBook ->
// insert contact with data row
val uid = "12345"
val contact = Contact(
uid = uid,
displayName = "Test Contact",
phoneNumbers = LinkedList(listOf(LabeledProperty(Telephone("1234567890"))))
)
val uri = LocalContact(addressBook, contact, null, null, 0).add()
val id = ContentUris.parseId(uri)
val localContact = addressBook.findContactById(id)
localContact.resetDirty()
assertFalse("Contact is dirty before moving", isContactDirty(addressBook, id))
// rename address book
val newName = "New Name"
addressBook.renameAccount(newName)
assertEquals(newName, addressBook.addressBookAccount.name)
// check whether contact is still here (including data rows) and not dirty
val result = addressBook.findContactById(id)
assertFalse("Contact is dirty after moving", isContactDirty(addressBook, id))
val contact2 = result.getContact()
assertEquals(uid, contact2.uid)
assertEquals("Test Contact", contact2.displayName)
assertEquals("1234567890", contact2.phoneNumbers.first().component1().text)
}
}
/**
* Tests whether groups are moved (and not lost) when an address book is renamed.
*/
@Test
fun test_renameAccount_retainsGroups() {
localTestAddressBookProvider.provide(account, provider) { addressBook ->
// insert group
val localGroup = LocalGroup(addressBook, Contact(displayName = "Test Group"), null, null, 0)
val uri = localGroup.add()
val id = ContentUris.parseId(uri)
// make sure it's not dirty
localGroup.clearDirty(Optional.empty(), null, null)
assertFalse("Group is dirty before moving", isGroupDirty(addressBook, id))
// rename address book
val newName = "New Name"
assertTrue(addressBook.renameAccount(newName))
assertEquals(newName, addressBook.addressBookAccount.name)
// check whether group is still here and not dirty
val result = addressBook.findGroupById(id)
assertFalse("Group is dirty after moving", isGroupDirty(addressBook, id))
val group = result.getContact()
assertEquals("Test Group", group.displayName)
}
}
// helpers
/**
* Returns the dirty flag of the given contact.
*
* @return true if the contact is dirty, false otherwise
*
* @throws FileNotFoundException if the contact can't be found
*/
    fun isContactDirty(addressBook: LocalAddressBook, id: Long): Boolean {
        val uri = ContentUris.withAppendedId(addressBook.rawContactsSyncUri(), id)
provider.query(uri, arrayOf(ContactsContract.RawContacts.DIRTY), null, null, null)?.use { cursor ->
if (cursor.moveToFirst())
return cursor.getInt(0) != 0
}
throw FileNotFoundException()
}
/**
* Returns the dirty flag of the given contact group.
*
* @return true if the group is dirty, false otherwise
*
* @throws FileNotFoundException if the group can't be found
*/
    fun isGroupDirty(addressBook: LocalAddressBook, id: Long): Boolean {
        val uri = ContentUris.withAppendedId(addressBook.groupsSyncUri(), id)
provider.query(uri, arrayOf(ContactsContract.Groups.DIRTY), null, null, null)?.use { cursor ->
if (cursor.moveToFirst())
return cursor.getInt(0) != 0
}
throw FileNotFoundException()
}
companion object {
@JvmField
@ClassRule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS)!!
private lateinit var provider: ContentProviderClient
@BeforeClass
@JvmStatic
fun connect() {
val context = InstrumentationRegistry.getInstrumentation().context
provider = context.contentResolver.acquireContentProviderClient(ContactsContract.AUTHORITY)!!
}
@AfterClass
@JvmStatic
fun disconnect() {
provider.close()
}
}
}

View file

@@ -0,0 +1,236 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentUris
import android.content.ContentValues
import android.content.Entity
import android.provider.CalendarContract
import android.provider.CalendarContract.ACCOUNT_TYPE_LOCAL
import android.provider.CalendarContract.Events
import androidx.core.content.contentValuesOf
import androidx.test.platform.app.InstrumentationRegistry
import at.bitfire.ical4android.Event
import at.bitfire.ical4android.util.MiscUtils.asSyncAdapter
import at.bitfire.ical4android.util.MiscUtils.closeCompat
import at.bitfire.synctools.storage.calendar.AndroidCalendar
import at.bitfire.synctools.storage.calendar.AndroidCalendarProvider
import at.bitfire.synctools.storage.calendar.AndroidEvent2
import at.bitfire.synctools.test.InitCalendarProviderRule
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import net.fortuna.ical4j.model.property.DtStart
import net.fortuna.ical4j.model.property.RRule
import net.fortuna.ical4j.model.property.RecurrenceId
import net.fortuna.ical4j.model.property.Status
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import org.junit.rules.TestRule
import javax.inject.Inject
@HiltAndroidTest
class LocalCalendarTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val initCalendarProviderRule: TestRule = InitCalendarProviderRule.initialize()
@Inject
lateinit var localCalendarFactory: LocalCalendar.Factory
private val account = Account("LocalCalendarTest", ACCOUNT_TYPE_LOCAL)
private lateinit var androidCalendar: AndroidCalendar
private lateinit var client: ContentProviderClient
private lateinit var calendar: LocalCalendar
@Before
fun setUp() {
hiltRule.inject()
val context = InstrumentationRegistry.getInstrumentation().targetContext
client = context.contentResolver.acquireContentProviderClient(CalendarContract.AUTHORITY)!!
val provider = AndroidCalendarProvider(account, client)
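        // create an empty test calendar; it is deleted again in tearDown()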
androidCalendar = provider.createAndGetCalendar(ContentValues())
calendar = localCalendarFactory.create(androidCalendar)
}
@After
fun tearDown() {
androidCalendar.delete()
client.closeCompat()
}
@Test
fun testDeleteDirtyEventsWithoutInstances_NoInstances_CancelledExceptions() {
// create recurring event with only deleted/cancelled instances
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with 3 instances"
rRules.add(RRule("FREQ=DAILY;COUNT=3"))
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220120T010203Z")
dtStart = DtStart("20220120T010203Z")
summary = "Cancelled exception on 1st day"
status = Status.VEVENT_CANCELLED
})
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220121T010203Z")
dtStart = DtStart("20220121T010203Z")
summary = "Cancelled exception on 2nd day"
status = Status.VEVENT_CANCELLED
})
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220122T010203Z")
dtStart = DtStart("20220122T010203Z")
summary = "Cancelled exception on 3rd day"
status = Status.VEVENT_CANCELLED
})
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
val eventId = localEvent.id
// set event as dirty
client.update(ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId), ContentValues(1).apply {
put(Events.DIRTY, 1)
}, null, null)
// this method should mark the event as deleted
calendar.deleteDirtyEventsWithoutInstances()
// verify that event is now marked as deleted
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId),
arrayOf(Events.DELETED), null, null, null
)!!.use { cursor ->
cursor.moveToNext()
assertEquals(1, cursor.getInt(0))
}
}
@Test
// Needs InitCalendarProviderRule
fun testDeleteDirtyEventsWithoutInstances_Recurring_Instances() {
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with 3 instances"
rRules.add(RRule("FREQ=DAILY;COUNT=3"))
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
val eventUrl = androidCalendar.eventUri(localEvent.id)
// set event as dirty
client.update(eventUrl, contentValuesOf(
Events.DIRTY to 1
), null, null)
// this method should mark the event as deleted
calendar.deleteDirtyEventsWithoutInstances()
// verify that event is not marked as deleted
client.query(eventUrl, arrayOf(Events.DELETED), null, null, null)!!.use { cursor ->
cursor.moveToNext()
assertEquals(0, cursor.getInt(0))
}
}
/**
* Verifies that [LocalCalendar.removeNotDirtyMarked] works as expected.
* @param contentValues values to set on the event. Required:
* - [Events._ID]
* - [Events.DIRTY]
*/
private fun testRemoveNotDirtyMarked(contentValues: ContentValues) {
val id = androidCalendar.addEvent(Entity(
contentValuesOf(
Events.CALENDAR_ID to androidCalendar.id,
Events.DTSTART to System.currentTimeMillis(),
Events.DTEND to System.currentTimeMillis(),
Events.TITLE to "Some Event",
AndroidEvent2.COLUMN_FLAGS to 123
).apply { putAll(contentValues) }
))
calendar.removeNotDirtyMarked(123)
assertNull(androidCalendar.getEvent(id))
}
@Test
fun testRemoveNotDirtyMarked_IdLargerThanIntMaxValue() = testRemoveNotDirtyMarked(
contentValuesOf(Events._ID to Int.MAX_VALUE.toLong() + 10, Events.DIRTY to 0)
)
@Test
fun testRemoveNotDirtyMarked_DirtyIs0() = testRemoveNotDirtyMarked(
contentValuesOf(Events._ID to 1, Events.DIRTY to 0)
)
@Test
fun testRemoveNotDirtyMarked_DirtyNull() = testRemoveNotDirtyMarked(
contentValuesOf(Events._ID to 1, Events.DIRTY to null)
)
/**
* Verifies that [LocalCalendar.markNotDirty] works as expected.
* @param contentValues values to set on the event. Required:
* - [Events.DIRTY]
*/
private fun testMarkNotDirty(contentValues: ContentValues) {
val id = androidCalendar.addEvent(Entity(
contentValuesOf(
Events.CALENDAR_ID to androidCalendar.id,
Events._ID to 1,
Events.DTSTART to System.currentTimeMillis(),
Events.DTEND to System.currentTimeMillis(),
Events.TITLE to "Some Event",
AndroidEvent2.COLUMN_FLAGS to 123
).apply { putAll(contentValues) }
))
val updated = calendar.markNotDirty(321)
assertEquals(1, updated)
assertEquals(321, androidCalendar.getEvent(id)?.flags)
}
@Test
fun test_markNotDirty_DirtyIs0() = testMarkNotDirty(
contentValuesOf(
Events.DIRTY to 0
)
)
@Test
fun test_markNotDirty_DirtyIsNull() = testMarkNotDirty(
contentValuesOf(
Events.DIRTY to null
)
)
}

View file

@@ -0,0 +1,265 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentUris
import android.content.ContentValues
import android.provider.CalendarContract
import android.provider.CalendarContract.ACCOUNT_TYPE_LOCAL
import android.provider.CalendarContract.Events
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.ical4android.Event
import at.bitfire.ical4android.util.MiscUtils.closeCompat
import at.bitfire.synctools.storage.calendar.AndroidCalendarProvider
import at.techbee.jtx.JtxContract.asSyncAdapter
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import net.fortuna.ical4j.model.property.DtStart
import net.fortuna.ical4j.model.property.RRule
import net.fortuna.ical4j.model.property.RecurrenceId
import net.fortuna.ical4j.model.property.Status
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.UUID
import javax.inject.Inject
@HiltAndroidTest
class LocalEventTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CALENDAR, Manifest.permission.WRITE_CALENDAR)
@Inject
lateinit var localCalendarFactory: LocalCalendar.Factory
private val account = Account("LocalCalendarTest", ACCOUNT_TYPE_LOCAL)
private lateinit var client: ContentProviderClient
private lateinit var calendar: LocalCalendar
@Before
fun setUp() {
hiltRule.inject()
val context = InstrumentationRegistry.getInstrumentation().targetContext
client = context.contentResolver.acquireContentProviderClient(CalendarContract.AUTHORITY)!!
val provider = AndroidCalendarProvider(account, client)
calendar = localCalendarFactory.create(provider.createAndGetCalendar(ContentValues()))
}
@After
fun tearDown() {
calendar.androidCalendar.delete()
client.closeCompat()
}
@Test
fun testPrepareForUpload_NoUid() {
// create event
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event without uid"
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
// prepare for upload - this should generate a new random uuid, returned as filename
val fileNameWithSuffix = localEvent.prepareForUpload()
val fileName = fileNameWithSuffix.removeSuffix(".ics")
        // throws an exception if fileName is not a UUID
UUID.fromString(fileName)
// UID in calendar storage should be the same as file name
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI, localEvent.id!!).asSyncAdapter(account),
arrayOf(Events.UID_2445), null, null, null
)!!.use { cursor ->
cursor.moveToFirst()
assertEquals(fileName, cursor.getString(0))
}
}
@Test
fun testPrepareForUpload_NormalUid() {
// create event
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with normal uid"
uid = "some-event@hostname.tld" // old UID format, UUID would be new format
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
// prepare for upload - this should use the UID for the file name
val fileNameWithSuffix = localEvent.prepareForUpload()
val fileName = fileNameWithSuffix.removeSuffix(".ics")
assertEquals(event.uid, fileName)
// UID in calendar storage should still be set, too
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI, localEvent.id!!).asSyncAdapter(account),
arrayOf(Events.UID_2445), null, null, null
)!!.use { cursor ->
cursor.moveToFirst()
assertEquals(fileName, cursor.getString(0))
}
}
@Test
fun testPrepareForUpload_UidHasDangerousChars() {
// create event
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with funny uid"
uid = "https://www.example.com/events/asdfewfe-cxyb-ewrws-sadfrwerxyvser-asdfxye-"
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
// prepare for upload - this should generate a new random uuid, returned as filename
val fileNameWithSuffix = localEvent.prepareForUpload()
val fileName = fileNameWithSuffix.removeSuffix(".ics")
        // throws an exception if fileName is not a UUID
UUID.fromString(fileName)
// UID in calendar storage shouldn't have been changed
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI, localEvent.id!!).asSyncAdapter(account),
arrayOf(Events.UID_2445), null, null, null
)!!.use { cursor ->
cursor.moveToFirst()
assertEquals(event.uid, cursor.getString(0))
}
}
@Test
fun testDeleteDirtyEventsWithoutInstances_NoInstances_Exdate() {
// TODO
}
@Test
fun testDeleteDirtyEventsWithoutInstances_NoInstances_CancelledExceptions() {
// create recurring event with only deleted/cancelled instances
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with 3 instances"
rRules.add(RRule("FREQ=DAILY;COUNT=3"))
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220120T010203Z")
dtStart = DtStart("20220120T010203Z")
summary = "Cancelled exception on 1st day"
status = Status.VEVENT_CANCELLED
})
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220121T010203Z")
dtStart = DtStart("20220121T010203Z")
summary = "Cancelled exception on 2nd day"
status = Status.VEVENT_CANCELLED
})
exceptions.add(Event().apply {
recurrenceId = RecurrenceId("20220122T010203Z")
dtStart = DtStart("20220122T010203Z")
summary = "Cancelled exception on 3rd day"
status = Status.VEVENT_CANCELLED
})
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
val eventId = localEvent.id!!
// set event as dirty
client.update(ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId), ContentValues(1).apply {
put(Events.DIRTY, 1)
}, null, null)
// this method should mark the event as deleted
calendar.deleteDirtyEventsWithoutInstances()
// verify that event is now marked as deleted
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId),
arrayOf(Events.DELETED), null, null, null
)!!.use { cursor ->
cursor.moveToNext()
assertEquals(1, cursor.getInt(0))
}
}
@Test
fun testDeleteDirtyEventsWithoutInstances_Recurring_Instances() {
val event = Event().apply {
dtStart = DtStart("20220120T010203Z")
summary = "Event with 3 instances"
rRules.add(RRule("FREQ=DAILY;COUNT=3"))
}
calendar.add(
event = event,
fileName = "filename.ics",
eTag = null,
scheduleTag = null,
flags = LocalResource.FLAG_REMOTELY_PRESENT
)
val localEvent = calendar.findByName("filename.ics")!!
val eventId = localEvent.id!!
// set event as dirty
client.update(ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId), ContentValues(1).apply {
put(Events.DIRTY, 1)
}, null, null)
// this method should mark the event as deleted
calendar.deleteDirtyEventsWithoutInstances()
// verify that event is not marked as deleted
client.query(
ContentUris.withAppendedId(Events.CONTENT_URI.asSyncAdapter(account), eventId),
arrayOf(Events.DELETED), null, null, null
)!!.use { cursor ->
cursor.moveToNext()
assertEquals(0, cursor.getInt(0))
}
}
}

View file

@@ -0,0 +1,278 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentUris
import android.content.ContentValues
import android.content.Context
import android.provider.ContactsContract
import android.provider.ContactsContract.CommonDataKinds.GroupMembership
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.synctools.storage.ContactsBatchOperation
import at.bitfire.vcard4android.CachedGroupMembership
import at.bitfire.vcard4android.Contact
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.Optional
import javax.inject.Inject
@HiltAndroidTest
class LocalGroupTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS)!!
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
lateinit var provider: ContentProviderClient
val account = Account("Test Account", "Test Account Type")
@Before
fun setUp() {
hiltRule.inject()
val context = InstrumentationRegistry.getInstrumentation().context
provider = context.contentResolver.acquireContentProviderClient(ContactsContract.AUTHORITY)!!
}
@After
fun tearDown() {
provider.close()
}
@Test
fun testApplyPendingMemberships_addPendingMembership() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.GROUP_VCARDS) { ab ->
val contact1 = LocalContact(ab, Contact().apply {
uid = "test1"
displayName = "Test"
}, "test1.vcf", null, 0)
contact1.add()
val group = newGroup(ab)
// set pending membership of contact1
ab.provider!!.update(
ContentUris.withAppendedId(ab.groupsSyncUri(), group.id!!),
ContentValues().apply {
put(LocalGroup.COLUMN_PENDING_MEMBERS, LocalGroup.PendingMemberships(setOf("test1")).toString())
},
null, null
)
// pending membership -> contact1 should be added to group
LocalGroup.applyPendingMemberships(ab)
// check group membership
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI), arrayOf(GroupMembership.GROUP_ROW_ID, GroupMembership.RAW_CONTACT_ID),
"${GroupMembership.MIMETYPE}=?", arrayOf(GroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertTrue(cursor.moveToNext())
assertEquals(group.id, cursor.getLong(0))
assertEquals(contact1.id, cursor.getLong(1))
assertFalse(cursor.moveToNext())
}
// check cached group membership
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI), arrayOf(CachedGroupMembership.GROUP_ID, CachedGroupMembership.RAW_CONTACT_ID),
"${CachedGroupMembership.MIMETYPE}=?", arrayOf(CachedGroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertTrue(cursor.moveToNext())
assertEquals(group.id, cursor.getLong(0))
assertEquals(contact1.id, cursor.getLong(1))
assertFalse(cursor.moveToNext())
}
}
}
@Test
fun testApplyPendingMemberships_removeMembership() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.GROUP_VCARDS) { ab ->
val contact1 = LocalContact(ab, Contact().apply {
uid = "test1"
displayName = "Test"
}, "test1.vcf", null, 0)
contact1.add()
val group = newGroup(ab)
// add contact1 to group
val batch = ContactsBatchOperation(ab.provider!!)
contact1.addToGroup(batch, group.id!!)
batch.commit()
// no pending memberships -> membership should be removed
LocalGroup.applyPendingMemberships(ab)
// check group membership
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI),
arrayOf(GroupMembership.GROUP_ROW_ID, GroupMembership.RAW_CONTACT_ID),
"${GroupMembership.MIMETYPE}=?",
arrayOf(GroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertFalse(cursor.moveToNext())
}
// check cached group membership
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI),
arrayOf(CachedGroupMembership.GROUP_ID, CachedGroupMembership.RAW_CONTACT_ID),
"${CachedGroupMembership.MIMETYPE}=?",
arrayOf(CachedGroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertFalse(cursor.moveToNext())
}
}
}
@Test
fun testClearDirty_addCachedGroupMembership() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { ab ->
val group = newGroup(ab)
val contact1 =
LocalContact(ab, Contact().apply { displayName = "Test" }, "fn.vcf", null, 0)
contact1.add()
// insert group membership, but no cached group membership
ab.provider!!.insert(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI), ContentValues().apply {
put(GroupMembership.MIMETYPE, GroupMembership.CONTENT_ITEM_TYPE)
put(GroupMembership.RAW_CONTACT_ID, contact1.id)
put(GroupMembership.GROUP_ROW_ID, group.id)
}
)
group.clearDirty(Optional.empty(), null)
// check cached group membership
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI),
arrayOf(CachedGroupMembership.GROUP_ID, CachedGroupMembership.RAW_CONTACT_ID),
"${CachedGroupMembership.MIMETYPE}=?",
arrayOf(CachedGroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertTrue(cursor.moveToNext())
assertEquals(group.id, cursor.getLong(0))
assertEquals(contact1.id, cursor.getLong(1))
assertFalse(cursor.moveToNext())
}
}
}
@Test
fun testClearDirty_removeCachedGroupMembership() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { ab ->
val group = newGroup(ab)
val contact1 = LocalContact(ab, Contact().apply { displayName = "Test" }, "fn.vcf", null, 0)
contact1.add()
// insert cached group membership, but no group membership
ab.provider!!.insert(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI), ContentValues().apply {
put(CachedGroupMembership.MIMETYPE, CachedGroupMembership.CONTENT_ITEM_TYPE)
put(CachedGroupMembership.RAW_CONTACT_ID, contact1.id)
put(CachedGroupMembership.GROUP_ID, group.id)
}
)
group.clearDirty(Optional.empty(), null)
// cached group membership should be gone
ab.provider!!.query(
ab.syncAdapterURI(ContactsContract.Data.CONTENT_URI), arrayOf(CachedGroupMembership.GROUP_ID, CachedGroupMembership.RAW_CONTACT_ID),
"${CachedGroupMembership.MIMETYPE}=?", arrayOf(CachedGroupMembership.CONTENT_ITEM_TYPE),
null
)!!.use { cursor ->
assertFalse(cursor.moveToNext())
}
}
}
@Test
fun testMarkMembersDirty() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { ab ->
val group = newGroup(ab)
val contact1 =
LocalContact(ab, Contact().apply { displayName = "Test" }, "fn.vcf", null, 0)
contact1.add()
val batch = ContactsBatchOperation(ab.provider!!)
contact1.addToGroup(batch, group.id!!)
batch.commit()
assertEquals(0, ab.findDirty().size)
group.markMembersDirty()
assertEquals(contact1.id, ab.findDirty().first().id)
}
}
@Test
fun testPrepareForUpload() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { ab ->
val group = newGroup(ab)
assertNull(group.getContact().uid)
val fileName = group.prepareForUpload()
val newUid = group.getContact().uid
assertNotNull(newUid)
assertEquals("$newUid.vcf", fileName)
}
}
@Test
fun testUpdate() {
localTestAddressBookProvider.provide(account, provider) { ab ->
val group = newGroup(ab)
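            // updating the group with new contact data should complete without throwing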
group.update(Contact(displayName = "New Group Name"), null, null, null, 0)
}
}
// helpers
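    /**
     * Creates a group with display name "Test Group" in the given address book and adds it to the contacts provider.
     */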
private fun newGroup(addressBook: LocalAddressBook): LocalGroup =
LocalGroup(addressBook,
Contact().apply {
displayName = "Test Group"
}, null, null, 0
).apply {
add()
}
}

View file

@@ -0,0 +1,59 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.Context
import at.bitfire.davdroid.repository.DavCollectionRepository
import at.bitfire.davdroid.repository.DavServiceRepository
import at.bitfire.davdroid.settings.AccountSettings
import at.bitfire.davdroid.sync.adapter.SyncFrameworkIntegration
import at.bitfire.vcard4android.GroupMethod
import dagger.assisted.Assisted
import dagger.assisted.AssistedFactory
import dagger.assisted.AssistedInject
import dagger.hilt.android.qualifiers.ApplicationContext
import java.util.Optional
import java.util.logging.Logger
/**
* A local address book that provides an easy way to set the group method in tests.
*/
class LocalTestAddressBook @AssistedInject constructor(
@Assisted account: Account,
@Assisted("addressBook") addressBookAccount: Account,
@Assisted provider: ContentProviderClient,
@Assisted override val groupMethod: GroupMethod,
accountSettingsFactory: AccountSettings.Factory,
collectionRepository: DavCollectionRepository,
@ApplicationContext context: Context,
logger: Logger,
serviceRepository: DavServiceRepository,
syncFramework: SyncFrameworkIntegration
): LocalAddressBook(
account = account,
_addressBookAccount = addressBookAccount,
provider = provider,
accountSettingsFactory = accountSettingsFactory,
collectionRepository = collectionRepository,
context = context,
dirtyVerifier = Optional.empty(),
logger = logger,
serviceRepository = serviceRepository,
syncFramework = syncFramework
) {
@AssistedFactory
interface Factory {
fun create(
account: Account,
@Assisted("addressBook") addressBookAccount: Account,
provider: ContentProviderClient,
groupMethod: GroupMethod
): LocalTestAddressBook
}
}

View file

@@ -0,0 +1,72 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource
import android.accounts.Account
import android.accounts.AccountManager
import android.content.ContentProviderClient
import android.content.Context
import at.bitfire.davdroid.R
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import org.junit.Assert.assertTrue
import java.util.concurrent.atomic.AtomicInteger
import javax.inject.Inject
/**
* Provides [LocalTestAddressBook]s in tests.
*/
class LocalTestAddressBookProvider @Inject constructor(
@ApplicationContext context: Context,
private val localTestAddressBookFactory: LocalTestAddressBook.Factory
) {
/**
* Counter for creating unique address book names.
*/
val counter = AtomicInteger()
val accountManager = AccountManager.get(context)
val accountType = context.getString(R.string.account_type_address_book)
/**
* Creates and provides a new temporary [LocalTestAddressBook] for the given [account] and
* removes it again.
*
* @param account The DAVx5 account to use for the address book
* @param provider Content provider needed to access and modify the address book
* @param groupMethod The group method the address book should use
     * @param block Function to execute with the temporarily available address book
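     *
     * Typical usage, mirroring the calls in the tests of this package:
     *
     * ```
     * localTestAddressBookProvider.provide(account, provider) { addressBook ->
     *     // work with the temporary address book here
     * }
     * ```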
*/
fun provide(
account: Account,
provider: ContentProviderClient,
groupMethod: GroupMethod = GroupMethod.GROUP_VCARDS,
block: (LocalTestAddressBook) -> Unit
) {
// create new address book account
val addressBookAccount = Account("Test Address Book ${counter.incrementAndGet()}", accountType)
assertTrue(accountManager.addAccountExplicitly(addressBookAccount, null, null))
val addressBook = localTestAddressBookFactory.create(account, addressBookAccount, provider, groupMethod)
// Empty the address book (Needed by LocalGroupTest)
for (contact in addressBook.queryContacts(null, null))
contact.delete()
for (group in addressBook.queryGroups(null, null))
group.delete()
try {
// provide address book
block(addressBook)
} finally {
// recreate account of provided address book, since the account might have been renamed
val renamedAccount = Account(addressBook.addressBookAccount.name, addressBook.addressBookAccount.type)
// remove address book account / address book
assertTrue(accountManager.removeAccountExplicitly(renamedAccount))
}
}
}

View file

@@ -0,0 +1,90 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource.contactrow
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentValues
import android.content.Context
import android.provider.ContactsContract
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.davdroid.resource.LocalContact
import at.bitfire.davdroid.resource.LocalTestAddressBookProvider
import at.bitfire.vcard4android.CachedGroupMembership
import at.bitfire.vcard4android.Contact
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.AfterClass
import org.junit.Assert.assertArrayEquals
import org.junit.Before
import org.junit.BeforeClass
import org.junit.ClassRule
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class CachedGroupMembershipHandlerTest {
@Inject
@ApplicationContext
lateinit var context: Context
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
@get:Rule
val hiltRule = HiltAndroidRule(this)
val account = Account("Test Account", "Test Account Type")
@Before
fun inject() {
hiltRule.inject()
}
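    /**
     * Handling a [CachedGroupMembership] data row should add the referenced group ID to the local
     * contact's cached group memberships.
     */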
@Test
fun testMembership() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.GROUP_VCARDS) { addressBook ->
val contact = Contact()
val localContact = LocalContact(addressBook, contact, null, null, 0)
CachedGroupMembershipHandler(localContact).handle(ContentValues().apply {
put(CachedGroupMembership.GROUP_ID, 123456)
put(CachedGroupMembership.RAW_CONTACT_ID, 789)
}, contact)
assertArrayEquals(arrayOf(123456L), localContact.cachedGroupMemberships.toArray())
}
}
companion object {
@JvmField
@ClassRule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS)!!
private lateinit var provider: ContentProviderClient
@BeforeClass
@JvmStatic
fun connect() {
val context = InstrumentationRegistry.getInstrumentation().context
provider = context.contentResolver.acquireContentProviderClient(ContactsContract.AUTHORITY)!!
}
@AfterClass
@JvmStatic
fun disconnect() {
provider.close()
}
}
}

View file

@@ -0,0 +1,103 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource.contactrow
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.Context
import android.net.Uri
import android.provider.ContactsContract
import android.provider.ContactsContract.CommonDataKinds.GroupMembership
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.davdroid.resource.LocalTestAddressBookProvider
import at.bitfire.vcard4android.Contact
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.AfterClass
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.BeforeClass
import org.junit.ClassRule
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class GroupMembershipBuilderTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
@get:Rule
val hiltRule = HiltAndroidRule(this)
val account = Account("Test Account", "Test Account Type")
@Before
fun inject() {
hiltRule.inject()
}
@Test
fun testCategories_GroupsAsCategories() {
val contact = Contact().apply {
categories += "TEST GROUP"
}
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { addressBookGroupsAsCategories ->
GroupMembershipBuilder(Uri.EMPTY, null, contact, addressBookGroupsAsCategories, false).build().also { result ->
assertEquals(1, result.size)
assertEquals(GroupMembership.CONTENT_ITEM_TYPE, result[0].values[GroupMembership.MIMETYPE])
assertEquals(addressBookGroupsAsCategories.findOrCreateGroup("TEST GROUP"), result[0].values[GroupMembership.GROUP_ROW_ID])
}
}
}
@Test
fun testCategories_GroupsAsVCards() {
val contact = Contact().apply {
categories += "TEST GROUP"
}
localTestAddressBookProvider.provide(account, provider, GroupMethod.GROUP_VCARDS) { addressBookGroupsAsVCards ->
GroupMembershipBuilder(Uri.EMPTY, null, contact, addressBookGroupsAsVCards, false).build().also { result ->
// group membership is constructed during post-processing
assertEquals(0, result.size)
}
}
}
companion object {
@JvmField
@ClassRule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS)!!
private lateinit var provider: ContentProviderClient
@BeforeClass
@JvmStatic
fun connect() {
val context: Context = InstrumentationRegistry.getInstrumentation().context
provider = context.contentResolver.acquireContentProviderClient(ContactsContract.AUTHORITY)!!
}
@AfterClass
@JvmStatic
fun disconnect() {
provider.close()
}
}
}

View file

@@ -0,0 +1,110 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource.contactrow
import android.Manifest
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.ContentValues
import android.content.Context
import android.provider.ContactsContract
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.rule.GrantPermissionRule
import at.bitfire.davdroid.resource.LocalContact
import at.bitfire.davdroid.resource.LocalTestAddressBookProvider
import at.bitfire.vcard4android.CachedGroupMembership
import at.bitfire.vcard4android.Contact
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.AfterClass
import org.junit.Assert
import org.junit.Assert.assertArrayEquals
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.BeforeClass
import org.junit.ClassRule
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class GroupMembershipHandlerTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
@get:Rule
    val hiltRule = HiltAndroidRule(this)
val account = Account("Test Account", "Test Account Type")
@Before
fun inject() {
hiltRule.inject()
}
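    /**
     * With the CATEGORIES group method, handling a group membership should add the group ID to the
     * contact's group memberships and the resolved group name to its categories.
     */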
@Test
fun testMembership_GroupsAsCategories() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.CATEGORIES) { addressBookGroupsAsCategories ->
val addressBookGroupsAsCategoriesGroup = addressBookGroupsAsCategories.findOrCreateGroup("TEST GROUP")
val contact = Contact()
val localContact = LocalContact(addressBookGroupsAsCategories, contact, null, null, 0)
GroupMembershipHandler(localContact).handle(ContentValues().apply {
put(CachedGroupMembership.GROUP_ID, addressBookGroupsAsCategoriesGroup)
put(CachedGroupMembership.RAW_CONTACT_ID, -1)
}, contact)
assertArrayEquals(arrayOf(addressBookGroupsAsCategoriesGroup), localContact.groupMemberships.toArray())
assertArrayEquals(arrayOf("TEST GROUP"), contact.categories.toArray())
}
}
@Test
fun testMembership_GroupsAsVCards() {
localTestAddressBookProvider.provide(account, provider, GroupMethod.GROUP_VCARDS) { addressBookGroupsAsVCards ->
val contact = Contact()
val localContact = LocalContact(addressBookGroupsAsVCards, contact, null, null, 0)
GroupMembershipHandler(localContact).handle(ContentValues().apply {
put(CachedGroupMembership.GROUP_ID, 12345) // because the group name is not queried and put into CATEGORIES, the group doesn't have to really exist
put(CachedGroupMembership.RAW_CONTACT_ID, -1)
}, contact)
assertArrayEquals(arrayOf(12345L), localContact.groupMemberships.toArray())
assertTrue(contact.categories.isEmpty())
}
}
companion object {
@JvmField
@ClassRule
val permissionRule = GrantPermissionRule.grant(Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS)!!
private lateinit var provider: ContentProviderClient
@BeforeClass
@JvmStatic
fun connect() {
val context: Context = InstrumentationRegistry.getInstrumentation().context
provider = context.contentResolver.acquireContentProviderClient(ContactsContract.AUTHORITY)!!
Assert.assertNotNull(provider)
}
@AfterClass
@JvmStatic
fun disconnect() {
provider.close()
}
}
}

View file

@@ -0,0 +1,32 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource.contactrow
import android.net.Uri
import at.bitfire.vcard4android.Contact
import org.junit.Assert.assertEquals
import org.junit.Test
class UnknownPropertiesBuilderTest {
@Test
fun testUnknownProperties_None() {
UnknownPropertiesBuilder(Uri.EMPTY, null, Contact(), false).build().also { result ->
assertEquals(0, result.size)
}
}
@Test
fun testUnknownProperties_Properties() {
UnknownPropertiesBuilder(Uri.EMPTY, null, Contact().apply {
unknownProperties = "X-TEST:12345"
}, false).build().also { result ->
assertEquals(1, result.size)
assertEquals(UnknownProperties.CONTENT_ITEM_TYPE, result[0].values[UnknownProperties.MIMETYPE])
assertEquals("X-TEST:12345", result[0].values[UnknownProperties.UNKNOWN_PROPERTIES])
}
}
}

View file

@@ -0,0 +1,33 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.resource.contactrow
import android.content.ContentValues
import at.bitfire.vcard4android.Contact
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNull
import org.junit.Test
class UnknownPropertiesHandlerTest {
@Test
fun testUnknownProperties_Empty() {
val contact = Contact()
UnknownPropertiesHandler.handle(ContentValues().apply {
putNull(UnknownProperties.UNKNOWN_PROPERTIES)
}, contact)
assertNull(contact.unknownProperties)
}
@Test
fun testUnknownProperties_Values() {
val contact = Contact()
UnknownPropertiesHandler.handle(ContentValues().apply {
put(UnknownProperties.UNKNOWN_PROPERTIES, "X-TEST:12345")
}, contact)
assertEquals("X-TEST:12345", contact.unknownProperties)
}
}

View file

@@ -0,0 +1,222 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.servicedetection
import android.security.NetworkSecurityPolicy
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.settings.SettingsManager
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class CollectionsWithoutHomeSetRefresherTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var logger: Logger
@Inject
lateinit var refresherFactory: CollectionsWithoutHomeSetRefresher.Factory
@BindValue
@MockK(relaxed = true)
lateinit var settings: SettingsManager
private lateinit var client: HttpClient
private lateinit var mockServer: MockWebServer
private lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
// Start mock web server
mockServer = MockWebServer().apply {
dispatcher = TestDispatcher(logger)
start()
}
// build HTTP client
client = httpClientBuilder.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
// insert test service
val serviceId = db.serviceDao().insertOrReplace(
Service(id = 0, accountName = "test", type = Service.TYPE_CARDDAV, principal = null)
)
service = db.serviceDao().get(serviceId)!!
}
@After
fun tearDown() {
client.close()
mockServer.shutdown()
}
// refreshCollectionsWithoutHomeSet
@Test
fun refreshCollectionsWithoutHomeSet_updatesExistingCollection() {
// place homeless collection in DB
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
)
)
// Refresh
refresherFactory.create(service, client.okHttpClient).refreshCollectionsWithoutHomeSet()
// Check the collection got updated - with display name and description
assertEquals(
Collection(
collectionId,
service.id,
null,
1, // will have gotten an owner too
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description"
),
db.collectionDao().get(collectionId)
)
}
@Test
fun refreshCollectionsWithoutHomeSet_deletesInaccessibleCollectionsWithoutHomeSet() {
// place homeless collection in DB - it is also inaccessible
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_INACCESSIBLE")
)
)
// Refresh - should delete collection
refresherFactory.create(service, client.okHttpClient).refreshCollectionsWithoutHomeSet()
// Check the collection got deleted
assertEquals(null, db.collectionDao().get(collectionId))
}
@Test
fun refreshCollectionsWithoutHomeSet_addsOwnerUrls() {
// place homeless collection in DB
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
)
)
// Refresh homeless collections
assertEquals(0, db.principalDao().getByService(service.id).size)
refresherFactory.create(service, client.okHttpClient).refreshCollectionsWithoutHomeSet()
        // Check that the principal was saved and that the collection was updated with a reference to it
val principals = db.principalDao().getByService(service.id)
assertEquals(1, principals.size)
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL"), principals[0].url)
assertEquals(null, principals[0].displayName)
assertEquals(
principals[0].id,
db.collectionDao().get(collectionId)!!.ownerId
)
}
companion object {
private const val PATH_CARDDAV = "/carddav"
private const val SUBPATH_PRINCIPAL = "/principal"
private const val SUBPATH_ADDRESSBOOK = "/addressbooks/my-contacts"
private const val SUBPATH_ADDRESSBOOK_INACCESSIBLE = "/addressbooks/inaccessible-contacts"
}
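    /**
     * Mock CardDAV server: PROPFIND requests are answered with a 207 multi-status that contains
     * address book properties (resource type, display name, description, owner) for the known
     * address book path and no properties for other paths; all other methods return 404.
     */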
class TestDispatcher(
private val logger: Logger
): Dispatcher() {
override fun dispatch(request: RecordedRequest): MockResponse {
val path = request.path!!.trimEnd('/')
logger.info("${request.method} on $path")
if (request.method.equals("PROPFIND", true)) {
val properties = when (path) {
PATH_CARDDAV + SUBPATH_ADDRESSBOOK ->
"<resourcetype>" +
" <collection/>" +
" <CARD:addressbook/>" +
"</resourcetype>" +
"<displayname>My Contacts</displayname>" +
"<CARD:addressbook-description>My Contacts Description</CARD:addressbook-description>" +
"<owner>" +
" <href>${PATH_CARDDAV + SUBPATH_PRINCIPAL}</href>" +
"</owner>"
else -> ""
}
return MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav' xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
"<response>" +
" <href>$path</href>" +
" <propstat><prop>"+
properties +
" </prop></propstat>" +
"</response>" +
"</multistatus>")
}
return MockResponse().setResponseCode(404)
}
}
}

View file

@@ -0,0 +1,230 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.servicedetection
import android.security.NetworkSecurityPolicy
import at.bitfire.dav4jvm.DavResource
import at.bitfire.dav4jvm.property.carddav.AddressbookHomeSet
import at.bitfire.dav4jvm.property.webdav.ResourceType
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.servicedetection.DavResourceFinder.Configuration.ServiceInfo
import at.bitfire.davdroid.settings.Credentials
import at.bitfire.davdroid.util.SensitiveString.Companion.toSensitiveString
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assert.assertArrayEquals
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.net.URI
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class DavResourceFinderTest {
companion object {
private const val PATH_NO_DAV = "/nodav"
private const val PATH_CALDAV = "/caldav"
private const val PATH_CARDDAV = "/carddav"
private const val PATH_CALDAV_AND_CARDDAV = "/both-caldav-carddav"
private const val SUBPATH_PRINCIPAL = "/principal"
private const val SUBPATH_ADDRESSBOOK_HOMESET = "/addressbooks"
private const val SUBPATH_ADDRESSBOOK = "/addressbooks/private-contacts"
}
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var logger: Logger
@Inject
lateinit var resourceFinderFactory: DavResourceFinder.Factory
private lateinit var server: MockWebServer
private lateinit var client: HttpClient
private lateinit var finder: DavResourceFinder
@Before
fun setUp() {
hiltRule.inject()
server = MockWebServer().apply {
dispatcher = TestDispatcher(logger)
start()
}
val credentials = Credentials(username = "mock", password = "12345".toSensitiveString())
client = httpClientBuilder
.authenticate(host = null, getCredentials = { credentials })
.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
val baseURI = URI.create("/")
finder = resourceFinderFactory.create(baseURI, credentials)
}
@After
fun tearDown() {
client.close()
server.shutdown()
}
@Test
fun testRememberIfAddressBookOrHomeset() {
// recognize home set
var info = ServiceInfo()
DavResource(client.okHttpClient, server.url(PATH_CARDDAV + SUBPATH_PRINCIPAL))
.propfind(0, AddressbookHomeSet.NAME) { response, _ ->
finder.scanResponse(ResourceType.ADDRESSBOOK, response, info)
}
assertEquals(0, info.collections.size)
assertEquals(1, info.homeSets.size)
assertEquals(server.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET/"), info.homeSets.first())
// recognize address book
info = ServiceInfo()
DavResource(client.okHttpClient, server.url(PATH_CARDDAV + SUBPATH_ADDRESSBOOK))
.propfind(0, ResourceType.NAME) { response, _ ->
finder.scanResponse(ResourceType.ADDRESSBOOK, response, info)
}
assertEquals(1, info.collections.size)
assertEquals(server.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"), info.collections.keys.first())
assertEquals(0, info.homeSets.size)
}
@Test
fun testProvidesService() {
assertFalse(finder.providesService(server.url(PATH_NO_DAV), DavResourceFinder.Service.CALDAV))
assertFalse(finder.providesService(server.url(PATH_NO_DAV), DavResourceFinder.Service.CARDDAV))
assertTrue(finder.providesService(server.url(PATH_CALDAV), DavResourceFinder.Service.CALDAV))
assertFalse(finder.providesService(server.url(PATH_CALDAV), DavResourceFinder.Service.CARDDAV))
assertTrue(finder.providesService(server.url(PATH_CARDDAV), DavResourceFinder.Service.CARDDAV))
assertFalse(finder.providesService(server.url(PATH_CARDDAV), DavResourceFinder.Service.CALDAV))
assertTrue(finder.providesService(server.url(PATH_CALDAV_AND_CARDDAV), DavResourceFinder.Service.CALDAV))
assertTrue(finder.providesService(server.url(PATH_CALDAV_AND_CARDDAV), DavResourceFinder.Service.CARDDAV))
}
@Test
fun testGetCurrentUserPrincipal() {
assertNull(finder.getCurrentUserPrincipal(server.url(PATH_NO_DAV), DavResourceFinder.Service.CALDAV))
assertNull(finder.getCurrentUserPrincipal(server.url(PATH_NO_DAV), DavResourceFinder.Service.CARDDAV))
assertEquals(
server.url(PATH_CALDAV + SUBPATH_PRINCIPAL),
finder.getCurrentUserPrincipal(server.url(PATH_CALDAV), DavResourceFinder.Service.CALDAV)
)
assertNull(finder.getCurrentUserPrincipal(server.url(PATH_CALDAV), DavResourceFinder.Service.CARDDAV))
assertEquals(
server.url(PATH_CARDDAV + SUBPATH_PRINCIPAL),
finder.getCurrentUserPrincipal(server.url(PATH_CARDDAV), DavResourceFinder.Service.CARDDAV)
)
assertNull(finder.getCurrentUserPrincipal(server.url(PATH_CARDDAV), DavResourceFinder.Service.CALDAV))
}
@Test
fun testQueryEmailAddress() {
assertArrayEquals(
arrayOf("email1@example.com", "email2@example.com"),
finder.queryEmailAddress(server.url(PATH_CALDAV + SUBPATH_PRINCIPAL)).toTypedArray()
)
assertTrue(finder.queryEmailAddress(server.url(PATH_CARDDAV + SUBPATH_PRINCIPAL)).isEmpty())
}
// mock server
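    /**
     * Mock server behaviour: requests without the expected Basic auth header get a 401; OPTIONS
     * responses advertise CalDAV/CardDAV support depending on the requested path; PROPFIND
     * responses return principal, home-set, address book or e-mail properties for the paths
     * defined in the companion object; everything else returns 404.
     */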
class TestDispatcher(
private val logger: Logger
): Dispatcher() {
override fun dispatch(request: RecordedRequest): MockResponse {
if (!checkAuth(request)) {
val authenticate = MockResponse().setResponseCode(401)
authenticate.setHeader("WWW-Authenticate", "Basic realm=\"test\"")
return authenticate
}
val path = request.path!!
if (request.method.equals("OPTIONS", true)) {
val dav = when {
path.startsWith(PATH_CALDAV) -> "calendar-access"
path.startsWith(PATH_CARDDAV) -> "addressbook"
path.startsWith(PATH_CALDAV_AND_CARDDAV) -> "calendar-access, addressbook"
else -> null
}
val response = MockResponse().setResponseCode(200)
if (dav != null)
response.addHeader("DAV", dav)
return response
} else if (request.method.equals("PROPFIND", true)) {
val props: String?
when (path) {
PATH_CALDAV,
PATH_CARDDAV ->
props = "<current-user-principal><href>$path$SUBPATH_PRINCIPAL</href></current-user-principal>"
PATH_CARDDAV + SUBPATH_PRINCIPAL ->
props = "<CARD:addressbook-home-set>" +
" <href>$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET</href>" +
"</CARD:addressbook-home-set>"
PATH_CARDDAV + SUBPATH_ADDRESSBOOK ->
props = "<resourcetype>" +
" <collection/>" +
" <CARD:addressbook/>" +
"</resourcetype>"
PATH_CALDAV + SUBPATH_PRINCIPAL ->
props = "<CAL:calendar-user-address-set>" +
" <href>urn:unknown-entry</href>" +
" <href>mailto:email1@example.com</href>" +
" <href>mailto:email2@example.com</href>" +
"</CAL:calendar-user-address-set>"
else -> props = null
}
logger.info("Sending props: $props")
return MockResponse()
.setResponseCode(207)
.setBody("<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav' xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
"<response>" +
" <href>${request.path}</href>" +
" <propstat><prop>$props</prop></propstat>" +
"</response>" +
"</multistatus>")
}
return MockResponse().setResponseCode(404)
}
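// Expected header is HTTP Basic authentication with the test credentials "mock" / "12345"
// ("mock:12345" Base64-encoded).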
private fun checkAuth(rq: RecordedRequest) =
rq.getHeader("Authorization") == "Basic bW9jazoxMjM0NQ=="
}
}

View file

@ -0,0 +1,473 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.servicedetection
import android.security.NetworkSecurityPolicy
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.HomeSet
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.settings.Settings
import at.bitfire.davdroid.settings.SettingsManager
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import junit.framework.TestCase.assertFalse
import junit.framework.TestCase.assertTrue
import kotlinx.coroutines.test.runTest
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class HomeSetRefresherTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var logger: Logger
@Inject
lateinit var homeSetRefresherFactory: HomeSetRefresher.Factory
@BindValue
@MockK(relaxed = true)
lateinit var settings: SettingsManager
private lateinit var client: HttpClient
private lateinit var mockServer: MockWebServer
private lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
// Start mock web server
mockServer = MockWebServer().apply {
dispatcher = TestDispatcher(logger)
start()
}
// build HTTP client
client = httpClientBuilder.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
// insert test service
val serviceId = db.serviceDao().insertOrReplace(
Service(id = 0, accountName = "test", type = Service.TYPE_CARDDAV, principal = null)
)
service = db.serviceDao().get(serviceId)!!
}
@After
fun tearDown() {
client.close()
mockServer.shutdown()
}
// refreshHomesetsAndTheirCollections
@Test
fun refreshHomesetsAndTheirCollections_addsNewCollection() = runTest {
// save homeset in DB
val homesetId = db.homeSetDao().insert(
HomeSet(id = 0, service.id, true, mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL"))
)
// Refresh
homeSetRefresherFactory.create(service, client.okHttpClient)
.refreshHomesetsAndTheirCollections()
// Check the collection defined in homeset is now in the database
assertEquals(
Collection(
1,
service.id,
homesetId,
1, // will have gotten an owner too
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description"
),
db.collectionDao().getByService(service.id).first()
)
}
@Test
fun refreshHomesetsAndTheirCollections_updatesExistingCollection() {
// save "old" collection in DB
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description"
)
)
// Refresh
homeSetRefresherFactory.create(service, client.okHttpClient).refreshHomesetsAndTheirCollections()
// Check the collection got updated
assertEquals(
Collection(
collectionId,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description"
),
db.collectionDao().get(collectionId)
)
}
@Test
fun refreshHomesetsAndTheirCollections_preservesCollectionFlags() {
// save "old" collection in DB - with set flags
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description",
forceReadOnly = true,
sync = true
)
)
// Refresh
homeSetRefresherFactory.create(service, client.okHttpClient).refreshHomesetsAndTheirCollections()
// Check the collection got updated
assertEquals(
Collection(
collectionId,
service.id,
null,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"),
displayName = "My Contacts",
description = "My Contacts Description",
forceReadOnly = true,
sync = true
),
db.collectionDao().get(collectionId)
)
}
@Test
fun refreshHomesetsAndTheirCollections_marksRemovedCollectionsAsHomeless() {
// save homeset in DB - one which is empty (zero address books) on the server side
val homesetId = db.homeSetDao().insert(
HomeSet(id = 0, service.id, true, mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET_EMPTY"))
)
// place collection in DB - as part of the homeset
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
homesetId,
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/")
)
)
// Refresh - should mark collection as homeless, because serverside homeset is empty.
homeSetRefresherFactory.create(service, client.okHttpClient).refreshHomesetsAndTheirCollections()
// Check the collection, is now marked as homeless
assertEquals(null, db.collectionDao().get(collectionId)!!.homeSetId)
}
@Test
fun refreshHomesetsAndTheirCollections_addsOwnerUrls() {
// save a homeset in DB
val homesetId = db.homeSetDao().insert(
HomeSet(id = 0, service.id, true, mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL"))
)
// place collection in DB - as part of the homeset
val collectionId = db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
homesetId, // part of above home set
null,
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/")
)
)
// Refresh - homesets and their collections
assertEquals(0, db.principalDao().getByService(service.id).size)
homeSetRefresherFactory.create(service, client.okHttpClient).refreshHomesetsAndTheirCollections()
// Check the principal was saved and the collection was updated with a reference to it
val principals = db.principalDao().getByService(service.id)
assertEquals(1, principals.size)
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL"), principals[0].url)
assertEquals(null, principals[0].displayName)
assertEquals(
principals[0].id,
db.collectionDao().get(collectionId)!!.ownerId
)
}
// other
@Test
fun shouldPreselect_none() {
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_NONE
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns ""
val collection = Collection(
0,
service.id,
0,
type = Collection.TYPE_ADDRESSBOOK,
url = mockServer.url("/addressbook-homeset/addressbook/")
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = true,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertFalse(refresher.shouldPreselect(collection, homesets))
}
@Test
fun shouldPreselect_all() {
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_ALL
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns ""
val collection = Collection(
0,
service.id,
0,
type = Collection.TYPE_ADDRESSBOOK,
url = mockServer.url("/addressbook-homeset/addressbook/")
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = false,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertTrue(refresher.shouldPreselect(collection, homesets))
}
@Test
fun shouldPreselect_all_blacklisted() {
val url = mockServer.url("/addressbook-homeset/addressbook/")
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_ALL
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns url.toString()
val collection = Collection(
id = 0,
serviceId = service.id,
homeSetId = 0,
type = Collection.TYPE_ADDRESSBOOK,
url = url
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = false,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertFalse(refresher.shouldPreselect(collection, homesets))
}
@Test
fun shouldPreselect_personal_notPersonal() {
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_PERSONAL
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns ""
val collection = Collection(
id = 0,
serviceId = service.id,
homeSetId = 0,
type = Collection.TYPE_ADDRESSBOOK,
url = mockServer.url("/addressbook-homeset/addressbook/")
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = false,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertFalse(refresher.shouldPreselect(collection, homesets))
}
@Test
fun shouldPreselect_personal_isPersonal() {
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_PERSONAL
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns ""
val collection = Collection(
0,
service.id,
0,
type = Collection.TYPE_ADDRESSBOOK,
url = mockServer.url("/addressbook-homeset/addressbook/")
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = true,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertTrue(refresher.shouldPreselect(collection, homesets))
}
@Test
fun shouldPreselect_personal_isPersonalButBlacklisted() {
val collectionUrl = mockServer.url("/addressbook-homeset/addressbook/")
every { settings.getIntOrNull(Settings.PRESELECT_COLLECTIONS) } returns Settings.PRESELECT_COLLECTIONS_PERSONAL
every { settings.getString(Settings.PRESELECT_COLLECTIONS_EXCLUDED) } returns collectionUrl.toString()
val collection = Collection(
id = 0,
serviceId = service.id,
homeSetId = 0,
type = Collection.TYPE_ADDRESSBOOK,
url = collectionUrl
)
val homesets = listOf(
HomeSet(
id = 0,
serviceId = service.id,
personal = true,
url = mockServer.url("/addressbook-homeset/")
)
)
val refresher = homeSetRefresherFactory.create(service, client.okHttpClient)
assertFalse(refresher.shouldPreselect(collection, homesets))
}
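// As the tests above show: PRESELECT_COLLECTIONS_NONE never preselects, PRESELECT_COLLECTIONS_ALL
// preselects unless the collection URL is listed in PRESELECT_COLLECTIONS_EXCLUDED, and
// PRESELECT_COLLECTIONS_PERSONAL additionally requires the collection to be in a personal home set.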
companion object {
private const val PATH_CARDDAV = "/carddav"
private const val SUBPATH_PRINCIPAL = "/principal"
private const val SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL = "/addressbooks-homeset"
private const val SUBPATH_ADDRESSBOOK_HOMESET_EMPTY = "/addressbooks-homeset-empty"
private const val SUBPATH_ADDRESSBOOK = "/addressbooks/my-contacts"
}
class TestDispatcher(
private val logger: Logger
) : Dispatcher() {
override fun dispatch(request: RecordedRequest): MockResponse {
val path = request.path!!.trimEnd('/')
if (request.method.equals("PROPFIND", true)) {
val properties = when (path) {
PATH_CARDDAV + SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL ->
"<resourcetype>" +
" <collection/>" +
" <CARD:addressbook/>" +
"</resourcetype>" +
"<displayname>My Contacts</displayname>" +
"<CARD:addressbook-description>My Contacts Description</CARD:addressbook-description>" +
"<owner>" +
" <href>${PATH_CARDDAV + SUBPATH_PRINCIPAL}</href>" +
"</owner>"
PATH_CARDDAV + SUBPATH_ADDRESSBOOK_HOMESET_EMPTY -> ""
else -> ""
}
logger.info("Queried: $path")
return MockResponse()
.setResponseCode(207)
.setBody(
"<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav' xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
"<response>" +
" <href>${PATH_CARDDAV + SUBPATH_ADDRESSBOOK}</href>" +
" <propstat><prop>" +
properties +
" </prop></propstat>" +
" <status>HTTP/1.1 200 OK</status>" +
"</response>" +
"</multistatus>"
)
}
return MockResponse().setResponseCode(404)
}
}
}

View file

@ -0,0 +1,236 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.servicedetection
import android.security.NetworkSecurityPolicy
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Principal
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.settings.SettingsManager
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import junit.framework.TestCase.assertEquals
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class PrincipalsRefresherTest {
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var logger: Logger
@Inject
lateinit var principalsRefresher: PrincipalsRefresher.Factory
@BindValue
@MockK(relaxed = true)
lateinit var settings: SettingsManager
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
private lateinit var client: HttpClient
private lateinit var mockServer: MockWebServer
private lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
// Start mock web server
mockServer = MockWebServer().apply {
dispatcher = TestDispatcher(logger)
start()
}
// build HTTP client
client = httpClientBuilder.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
// insert test service
val serviceId = db.serviceDao().insertOrReplace(
Service(id = 0, accountName = "test", type = Service.TYPE_CARDDAV, principal = null)
)
service = db.serviceDao().get(serviceId)!!
}
@After
fun tearDown() {
client.close()
mockServer.shutdown()
}
@Test
fun refreshPrincipals_inaccessiblePrincipal() {
// place principal without display name in db
val principalId = db.principalDao().insert(
Principal(
0,
service.id,
mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL_INACCESSIBLE"), // no trailing slash
null // no display name for now
)
)
// add an associated collection - otherwise the principal would (rightfully) be removed
db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
principalId, // create association with principal
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"), // with trailing slash
)
)
// Refresh principals
principalsRefresher.create(service, client.okHttpClient).refreshPrincipals()
// Check principal was not updated
val principals = db.principalDao().getByService(service.id)
assertEquals(1, principals.size)
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL_INACCESSIBLE"), principals[0].url)
assertEquals(null, principals[0].displayName)
}
@Test
fun refreshPrincipals_updatesPrincipal() {
// place principal without display name in db
val principalId = db.principalDao().insert(
Principal(
0,
service.id,
mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL"), // no trailing slash
null // no display name for now
)
)
// add an associated collection - otherwise the principal would (rightfully) be removed
db.collectionDao().insertOrUpdateByUrl(
Collection(
0,
service.id,
null,
principalId, // create association with principal
Collection.TYPE_ADDRESSBOOK,
mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK/"), // with trailing slash
)
)
// Refresh principals
principalsRefresher.create(service, client.okHttpClient).refreshPrincipals()
// Check principal now got a display name
val principals = db.principalDao().getByService(service.id)
assertEquals(1, principals.size)
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL"), principals[0].url)
assertEquals("Mr. Wobbles", principals[0].displayName)
}
@Test
fun refreshPrincipals_deletesPrincipalsWithoutCollections() {
// place principal without collections in DB
db.principalDao().insert(
Principal(
0,
service.id,
mockServer.url("$PATH_CARDDAV$SUBPATH_PRINCIPAL_WITHOUT_COLLECTIONS/")
)
)
// Refresh principals - this detects that the principal does not own any collections
principalsRefresher.create(service, client.okHttpClient).refreshPrincipals()
// Check principal was deleted
val principals = db.principalDao().getByService(service.id)
assertEquals(0, principals.size)
}
companion object {
private const val PATH_CARDDAV = "/carddav"
private const val SUBPATH_PRINCIPAL = "/principal"
private const val SUBPATH_PRINCIPAL_INACCESSIBLE = "/inaccessible-principal"
private const val SUBPATH_PRINCIPAL_WITHOUT_COLLECTIONS = "/principal2"
private const val SUBPATH_GROUPPRINCIPAL_0 = "/groups/0"
private const val SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL = "/addressbooks-homeset"
private const val SUBPATH_ADDRESSBOOK_HOMESET_EMPTY = "/addressbooks-homeset-empty"
private const val SUBPATH_ADDRESSBOOK = "/addressbooks/my-contacts"
}
class TestDispatcher(
private val logger: Logger
) : Dispatcher() {
override fun dispatch(request: RecordedRequest): MockResponse {
val path = request.path!!.trimEnd('/')
if (request.method.equals("PROPFIND", true)) {
val properties = when (path) {
PATH_CARDDAV + SUBPATH_PRINCIPAL ->
"<resourcetype><principal/></resourcetype>" +
"<displayname>Mr. Wobbles</displayname>" + "<CARD:addressbook-home-set>" + " <href>${PATH_CARDDAV}${SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL}</href>" + "</CARD:addressbook-home-set>" + "<group-membership>" + " <href>${PATH_CARDDAV}${SUBPATH_GROUPPRINCIPAL_0}</href>" +
"</group-membership>"
PATH_CARDDAV + SUBPATH_PRINCIPAL_WITHOUT_COLLECTIONS ->
"<CARD:addressbook-home-set>" +
" <href>${PATH_CARDDAV}${SUBPATH_ADDRESSBOOK_HOMESET_EMPTY}</href>" +
"</CARD:addressbook-home-set>" +
"<displayname>Mr. Wobbles Jr.</displayname>"
PATH_CARDDAV + SUBPATH_ADDRESSBOOK_HOMESET_EMPTY -> ""
else -> ""
}
logger.info("Queried: $path")
return MockResponse()
.setResponseCode(207)
.setBody(
"<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav' xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
"<response>" +
" <href>$path</href>" +
" <propstat><prop>" +
properties +
" </prop></propstat>" +
"</response>" +
"</multistatus>"
)
}
return MockResponse().setResponseCode(404)
}
}
}

View file

@ -0,0 +1,163 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.servicedetection
import android.security.NetworkSecurityPolicy
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.network.HttpClient
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assume
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject
@HiltAndroidTest
class ServiceRefresherTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var logger: Logger
@Inject
lateinit var serviceRefresherFactory: ServiceRefresher.Factory
private lateinit var client: HttpClient
private lateinit var mockServer: MockWebServer
private lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
// Start mock web server
mockServer = MockWebServer().apply {
dispatcher = TestDispatcher(logger)
start()
}
// build HTTP client
client = httpClientBuilder.build()
Assume.assumeTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)
// insert test service
val serviceId = db.serviceDao().insertOrReplace(
Service(id = 0, accountName = "test", type = Service.TYPE_CARDDAV, principal = null)
)
service = db.serviceDao().get(serviceId)!!
}
@After
fun tearDown() {
client.close()
mockServer.shutdown()
}
@Test
fun testDiscoverHomesets() {
val baseUrl = mockServer.url(PATH_CARDDAV + SUBPATH_PRINCIPAL)
// Query home sets
serviceRefresherFactory.create(service, client.okHttpClient)
.discoverHomesets(baseUrl)
// Check home sets have been saved correctly to the database
val savedHomesets = db.homeSetDao().getByService(service.id)
assertEquals(2, savedHomesets.size)
// Home set from current-user-principal
val personalHomeset = savedHomesets[1]
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL/"), personalHomeset.url)
assertEquals(service.id, personalHomeset.serviceId)
// personal should be true for homesets detected at the first query of current-user-principal (even if they also occur in a group principal)
assertEquals(true, personalHomeset.personal)
// Home set found in a group principal
val groupHomeset = savedHomesets[0]
assertEquals(mockServer.url("$PATH_CARDDAV$SUBPATH_ADDRESSBOOK_HOMESET_NON_PERSONAL/"), groupHomeset.url)
assertEquals(service.id, groupHomeset.serviceId)
// personal should be false for homesets not detected at the first query of current-user-principal (i.e. only found via group principals)
assertEquals(false, groupHomeset.personal)
}
companion object {
private const val PATH_CARDDAV = "/carddav"
private const val SUBPATH_PRINCIPAL = "/principal"
private const val SUBPATH_GROUPPRINCIPAL_0 = "/groups/0"
private const val SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL = "/addressbooks-homeset"
private const val SUBPATH_ADDRESSBOOK_HOMESET_NON_PERSONAL = "/addressbooks-homeset-non-personal"
}
class TestDispatcher(
private val logger: Logger
) : Dispatcher() {
override fun dispatch(request: RecordedRequest): MockResponse {
val path = request.path!!.trimEnd('/')
logger.info("Query: ${request.method} on $path ")
if (request.method.equals("PROPFIND", true)) {
val properties = when (path) {
PATH_CARDDAV + SUBPATH_PRINCIPAL ->
"<resourcetype><principal/></resourcetype>" +
"<displayname>Mr. Wobbles</displayname>" +
"<CARD:addressbook-home-set>" +
" <href>${PATH_CARDDAV}${SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL}</href>" +
"</CARD:addressbook-home-set>" +
"<group-membership>" +
" <href>${PATH_CARDDAV}${SUBPATH_GROUPPRINCIPAL_0}</href>" +
"</group-membership>"
PATH_CARDDAV + SUBPATH_GROUPPRINCIPAL_0 ->
"<resourcetype><principal/></resourcetype>" +
"<displayname>All address books</displayname>" +
"<CARD:addressbook-home-set>" +
" <href>${PATH_CARDDAV}${SUBPATH_ADDRESSBOOK_HOMESET_PERSONAL}</href>" +
" <href>${PATH_CARDDAV}${SUBPATH_ADDRESSBOOK_HOMESET_NON_PERSONAL}</href>" +
"</CARD:addressbook-home-set>"
else -> ""
}
return MockResponse()
.setResponseCode(207)
.setBody(
"<multistatus xmlns='DAV:' xmlns:CARD='urn:ietf:params:xml:ns:carddav' xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
"<response>" +
" <href>$path</href>" +
" <propstat><prop>" +
properties +
" </prop></propstat>" +
"</response>" +
"</multistatus>"
)
}
return MockResponse().setResponseCode(404)
}
}
}

View file

@ -0,0 +1,60 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings
import android.accounts.AccountManager
import android.content.Context
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountSettingsTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var accountSettingsFactory: AccountSettings.Factory
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context)
}
@Test(expected = IllegalArgumentException::class)
fun testUpdate_MissingMigrations() {
TestAccount.provide(version = 1) { account ->
// will run AccountSettings.update
accountSettingsFactory.create(account, abortOnMissingMigration = true)
}
}
@Test
fun testUpdate_RunAllMigrations() {
TestAccount.provide(version = 6) { account ->
// will run AccountSettings.update
accountSettingsFactory.create(account, abortOnMissingMigration = true)
val accountManager = AccountManager.get(context)
val version = accountManager.getUserData(account, AccountSettings.KEY_SETTINGS_VERSION).toInt()
assertEquals(AccountSettings.CURRENT_VERSION, version)
}
}
}

View file

@ -0,0 +1,93 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings
import androidx.arch.core.executor.testing.InstantTaskExecutorRule
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.flow.take
import kotlinx.coroutines.flow.toSet
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class SettingsManagerTest {
companion object {
/** Use this setting to test SettingsManager methods. Will be removed after every test run. */
const val SETTING_TEST = "test"
}
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val instantTaskExecutorRule = InstantTaskExecutorRule()
@Inject lateinit var settingsManager: SettingsManager
@Before
fun inject() {
hiltRule.inject()
}
@After
fun removeTestSetting() {
settingsManager.remove(SETTING_TEST)
}
@Test
fun test_containsKey_NotExisting() {
assertFalse(settingsManager.containsKey("notExisting"))
}
@Test
fun test_containsKey_Existing() {
// provided by DefaultsProvider
assertEquals(Settings.PROXY_TYPE_SYSTEM, settingsManager.getInt(Settings.PROXY_TYPE))
}
@Test
fun test_observerFlow_initialValue() = runTest {
var counter = 0
val live = settingsManager.observerFlow {
if (counter++ == 0)
23
else
throw AssertionError("A second value was requested")
}
assertEquals(23, live.first())
}
@Test
fun test_observerFlow_updatedValue() = runTest {
var counter = 0
val live = settingsManager.observerFlow {
when (counter++) {
0 -> {
// update some setting so that we will be called a second time
settingsManager.putBoolean(SETTING_TEST, true)
// and emit initial value
23
}
1 -> 42 // updated value
else -> throw AssertionError()
}
}
val result = live.take(2).toSet()
assertEquals(setOf(23, 42), result)
}
}

View file

@ -0,0 +1,102 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings.migration
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import androidx.test.rule.GrantPermissionRule
import at.bitfire.davdroid.R
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.resource.LocalAddressBook
import at.bitfire.davdroid.sync.account.TestAccount
import at.bitfire.davdroid.sync.account.setAndVerifyUserData
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountSettingsMigration17Test {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var migration: AccountSettingsMigration17
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val permissionRule = GrantPermissionRule.grant(android.Manifest.permission.READ_CONTACTS, android.Manifest.permission.WRITE_CONTACTS)
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testMigrate_OldAddressBook_CollectionInDB() {
val localAddressBookUserDataUrl = "url"
TestAccount.provide(version = 16) { account ->
val accountManager = AccountManager.get(context)
val addressBookAccountType = context.getString(R.string.account_type_address_book)
var addressBookAccount = Account("Address Book", addressBookAccountType)
assertTrue(accountManager.addAccountExplicitly(addressBookAccount, null, null))
try {
// address book has account + URL
val url = "https://example.com/address-book"
accountManager.setAndVerifyUserData(addressBookAccount, "real_account_name", account.name)
accountManager.setAndVerifyUserData(addressBookAccount, localAddressBookUserDataUrl, url)
// and is known in database
db.serviceDao().insertOrReplace(
Service(
id = 1, accountName = account.name, type = Service.TYPE_CARDDAV, principal = null
)
)
db.collectionDao().insert(
Collection(
id = 100,
serviceId = 1,
url = url.toHttpUrl(),
type = Collection.TYPE_ADDRESSBOOK,
displayName = "Some Address Book"
)
)
// run migration
migration.migrate(account)
// the migration renames the address book account; fetch the renamed account
addressBookAccount = accountManager.getAccountsByType(addressBookAccountType).filter {
accountManager.getUserData(it, localAddressBookUserDataUrl) == url
}.first()
assertEquals("Some Address Book (${account.name}) #100", addressBookAccount.name)
// ID is now assigned
assertEquals(100L, accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID)?.toLong())
} finally {
accountManager.removeAccountExplicitly(addressBookAccount)
}
}
}
}

View file

@ -0,0 +1,122 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings.migration
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import at.bitfire.davdroid.R
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.resource.LocalAddressBook
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.junit4.MockKRule
import io.mockk.mockkObject
import io.mockk.verify
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountSettingsMigration18Test {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var migration: AccountSettingsMigration18
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testMigrate_AddressBook_InvalidCollection() {
val addressBookAccountType = context.getString(R.string.account_type_address_book)
val addressBookAccount = Account("Address Book", addressBookAccountType)
val accountManager = AccountManager.get(context)
mockkObject(accountManager)
every { accountManager.getAccountsByType(addressBookAccountType) } returns arrayOf(addressBookAccount)
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID) } returns "123"
val account = Account("test", "test")
migration.migrate(account)
verify(exactly = 0) {
accountManager.setUserData(addressBookAccount, any(), any())
}
}
@Test
fun testMigrate_AddressBook_NoCollection() {
val addressBookAccountType = context.getString(R.string.account_type_address_book)
val addressBookAccount = Account("Address Book", addressBookAccountType)
val accountManager = AccountManager.get(context)
mockkObject(accountManager)
every { accountManager.getAccountsByType(addressBookAccountType) } returns arrayOf(addressBookAccount)
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID) } returns "123"
val account = Account("test", "test")
migration.migrate(account)
verify(exactly = 0) {
accountManager.setUserData(addressBookAccount, any(), any())
}
}
@Test
fun testMigrate_AddressBook_ValidCollection() {
val account = Account("test", "test")
db.serviceDao().insertOrReplace(Service(
id = 10,
accountName = account.name,
type = Service.TYPE_CARDDAV,
principal = null
))
db.collectionDao().insertOrUpdateByUrl(Collection(
id = 100,
serviceId = 10,
url = "http://example.com".toHttpUrl(),
type = Collection.TYPE_ADDRESSBOOK
))
val addressBookAccountType = context.getString(R.string.account_type_address_book)
val addressBookAccount = Account("Address Book", addressBookAccountType)
val accountManager = AccountManager.get(context)
mockkObject(accountManager)
every { accountManager.getAccountsByType(addressBookAccountType) } returns arrayOf(addressBookAccount)
every { accountManager.getUserData(addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID) } returns "100"
migration.migrate(account)
verify {
accountManager.setUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_NAME, account.name)
accountManager.setUserData(addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_TYPE, account.type)
}
}
}

View file

@ -0,0 +1,83 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings.migration
import android.accounts.Account
import android.content.Context
import android.util.Log
import androidx.hilt.work.HiltWorkerFactory
import androidx.work.Configuration
import androidx.work.WorkManager
import androidx.work.testing.WorkManagerTestInitHelper
import at.bitfire.davdroid.sync.AutomaticSyncManager
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.junit4.MockKRule
import io.mockk.mockkObject
import io.mockk.verify
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountSettingsMigration19Test {
@Inject @ApplicationContext
lateinit var context: Context
@BindValue
@RelaxedMockK
lateinit var automaticSyncManager: AutomaticSyncManager
@Inject
lateinit var migration: AccountSettingsMigration19
@Inject
lateinit var workerFactory: HiltWorkerFactory
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
@Before
fun setUp() {
hiltRule.inject()
// Initialize WorkManager for instrumentation tests.
val config = Configuration.Builder()
.setMinimumLoggingLevel(Log.DEBUG)
.setWorkerFactory(workerFactory)
.build()
WorkManagerTestInitHelper.initializeTestWorkManager(context, config)
}
@Test
fun testMigrate_CancelsOldWorkersAndUpdatesAutomaticSync() {
val workManager = WorkManager.getInstance(context)
mockkObject(workManager)
val account = Account("Some", "Test")
migration.migrate(account)
verify {
workManager.cancelUniqueWork("periodic-sync at.bitfire.davdroid.addressbooks Test/Some")
workManager.cancelUniqueWork("periodic-sync com.android.calendar Test/Some")
workManager.cancelUniqueWork("periodic-sync at.techbee.jtx.provider Test/Some")
workManager.cancelUniqueWork("periodic-sync org.dmfs.tasks Test/Some")
workManager.cancelUniqueWork("periodic-sync org.tasks.opentasks Test/Some")
automaticSyncManager.updateAutomaticSync(account)
}
}
}

View file

@ -0,0 +1,144 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.settings.migration
import android.Manifest
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import android.provider.CalendarContract
import android.provider.CalendarContract.Calendars
import androidx.core.content.contentValuesOf
import androidx.core.database.getLongOrNull
import androidx.test.rule.GrantPermissionRule
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.resource.LocalAddressBook
import at.bitfire.davdroid.resource.LocalCalendarStore
import at.bitfire.davdroid.resource.LocalTestAddressBookProvider
import at.bitfire.davdroid.sync.account.setAndVerifyUserData
import at.bitfire.ical4android.util.MiscUtils.asSyncAdapter
import at.bitfire.vcard4android.GroupMethod
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.Assert.assertEquals
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class AccountSettingsMigration20Test {
@Inject
lateinit var calendarStore: LocalCalendarStore
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var migration: AccountSettingsMigration20
@Inject
lateinit var localTestAddressBookProvider: LocalTestAddressBookProvider
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
@get:Rule
val permissionsRule = GrantPermissionRule.grant(
Manifest.permission.READ_CONTACTS, Manifest.permission.WRITE_CONTACTS,
Manifest.permission.READ_CALENDAR, Manifest.permission.WRITE_CALENDAR
)
val accountManager by lazy { AccountManager.get(context) }
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testMigrateAddressBooks_UrlMatchesCollection() {
// set up legacy address-book with URL, but without collection ID
val account = Account("test", "test")
val url = "https://example.com/"
db.serviceDao().insertOrReplace(Service(id = 1, accountName = account.name, type = Service.TYPE_CARDDAV, principal = null))
val collectionId = db.collectionDao().insert(Collection(
serviceId = 1,
type = Collection.Companion.TYPE_ADDRESSBOOK,
url = url.toHttpUrl()
))
localTestAddressBookProvider.provide(account, mockk(relaxed = true), GroupMethod.GROUP_VCARDS) { addressBook ->
accountManager.setAndVerifyUserData(addressBook.addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_NAME, account.name)
accountManager.setAndVerifyUserData(addressBook.addressBookAccount, LocalAddressBook.USER_DATA_ACCOUNT_TYPE, account.type)
accountManager.setAndVerifyUserData(addressBook.addressBookAccount, AccountSettingsMigration20.ADDRESS_BOOK_USER_DATA_URL, url)
accountManager.setAndVerifyUserData(addressBook.addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID, null)
migration.migrateAddressBooks(account, cardDavServiceId = 1)
assertEquals(
collectionId,
accountManager.getUserData(addressBook.addressBookAccount, LocalAddressBook.USER_DATA_COLLECTION_ID).toLongOrNull()
)
}
}
@Test
fun testMigrateCalendars_UrlMatchesCollection() {
// set up legacy calendar with URL, but without collection ID
val account = Account("test", CalendarContract.ACCOUNT_TYPE_LOCAL)
val url = "https://example.com/"
db.serviceDao().insertOrReplace(Service(id = 1, accountName = account.name, type = Service.TYPE_CALDAV, principal = null))
val collectionId = db.collectionDao().insert(
Collection(
serviceId = 1,
type = Collection.Companion.TYPE_CALENDAR,
url = url.toHttpUrl()
)
)
context.contentResolver.acquireContentProviderClient(CalendarContract.AUTHORITY)!!.use { provider ->
val uri = provider.insert(
Calendars.CONTENT_URI.asSyncAdapter(account),
contentValuesOf(
Calendars.ACCOUNT_NAME to account.name,
Calendars.ACCOUNT_TYPE to account.type,
Calendars.CALENDAR_DISPLAY_NAME to "Test",
Calendars.NAME to url,
Calendars.SYNC_EVENTS to 1
)
)!!.asSyncAdapter(account)
try {
migration.migrateCalendars(account, 1)
provider.query(uri, arrayOf(Calendars._SYNC_ID), null, null, null)!!.use { cursor ->
cursor.moveToNext()
assertEquals(collectionId, cursor.getLongOrNull(0))
}
} finally {
provider.delete(uri, null, null)
}
}
}
}

View file

@ -0,0 +1,238 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.ContentResolver
import android.content.SyncRequest
import android.content.SyncStatusObserver
import android.os.Bundle
import android.provider.CalendarContract
import androidx.test.filters.SdkSuppress
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.withTimeout
import org.junit.After
import org.junit.AfterClass
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.BeforeClass
import org.junit.Rule
import org.junit.Test
import java.util.Collections
import java.util.LinkedList
import java.util.logging.Logger
import javax.inject.Inject
import kotlin.time.Duration.Companion.seconds
@HiltAndroidTest
class AndroidSyncFrameworkTest: SyncStatusObserver {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var logger: Logger
lateinit var account: Account
val authority = CalendarContract.AUTHORITY
private lateinit var stateChangeListener: Any
private val recordedStates = Collections.synchronizedList(LinkedList<State>())
@Before
fun setUp() {
hiltRule.inject()
account = TestAccount.create()
// Make the test account syncable for the calendar authority
ContentResolver.setIsSyncable(account, authority, 1)
// Remember states the sync framework reports as pairs of (sync pending, sync active).
recordedStates.clear()
onStatusChanged(0) // record first entry (pending = false, active = false)
stateChangeListener = ContentResolver.addStatusChangeListener(
ContentResolver.SYNC_OBSERVER_TYPE_PENDING or ContentResolver.SYNC_OBSERVER_TYPE_ACTIVE,
this
)
}
@After
fun tearDown() {
ContentResolver.removeStatusChangeListener(stateChangeListener)
TestAccount.remove(account)
}
/**
* Correct behaviour of the sync framework on Android 13 and below.
* Pending state is correctly reflected.
*/
@SdkSuppress(maxSdkVersion = 33)
@Test
fun testVerifySyncAlwaysPending_correctBehaviour_android13() {
verifySyncStates(
listOf(
State(pending = false, active = false), // no sync pending or active
State(pending = true, active = false, optional = true), // sync becomes pending
State(pending = true, active = true), // ... and pending and active at the same time
State(pending = false, active = true), // ... and then only active
State(pending = false, active = false) // sync finished
)
)
}
/**
* Wrong behaviour of the sync framework on Android 14+.
* Pending state stays true forever (after the initial run), while the active state behaves correctly.
*/
@SdkSuppress(minSdkVersion = 34 /*, maxSdkVersion = 36 */)
@Test
fun testVerifySyncAlwaysPending_wrongBehaviour_android14() {
verifySyncStates(
listOf(
State(pending = false, active = false), // no sync pending or active
State(pending = true, active = false, optional = true), // sync becomes pending
State(pending = true, active = true), // ... and pending and active at the same time
State(pending = true, active = false) // ... and finishes, but stays pending
)
)
}
// helpers
private fun syncRequest() = SyncRequest.Builder()
.setSyncAdapter(account, authority)
.syncOnce()
.setExtras(Bundle()) // needed for Android 9
.setExpedited(true) // sync request will be scheduled at the front of the sync request queue
.setManual(true) // equivalent of setting both SYNC_EXTRAS_IGNORE_SETTINGS and SYNC_EXTRAS_IGNORE_BACKOFF
.build()
/**
* Verifies that the given expected states match the recorded states.
*/
private fun verifySyncStates(expectedStates: List<State>) = runBlocking {
// Verify that last state is non-optional.
if (expectedStates.last().optional)
throw IllegalArgumentException("Last expected state must not be optional")
// We use runBlocking for these tests because it uses the default dispatcher
// which does not auto-advance virtual time and we need real system time to
// test the sync framework behavior.
ContentResolver.requestSync(syncRequest())
// Even though the always-pending-bug is present on Android 14+, the sync active
// state behaves correctly, so we can record the state changes as pairs (pending,
// active) and expect a certain sequence of state pairs to verify the presence or
// absence of the bug on different Android versions.
withTimeout(60.seconds) { // Usually takes less than 30 seconds
while (recordedStates.size < expectedStates.size) {
// verify already known states
if (recordedStates.isNotEmpty())
assertStatesEqual(expectedStates, recordedStates, fullMatch = false)
delay(500) // avoid busy-waiting
}
assertStatesEqual(expectedStates, recordedStates, fullMatch = true)
}
}
private fun assertStatesEqual(expectedStates: List<State>, actualStates: List<State>, fullMatch: Boolean) {
assertTrue("Expected states=$expectedStates, actual=$actualStates", statesMatch(expectedStates, actualStates, fullMatch))
}
/**
* Checks whether [actualStates] have matching [expectedStates], under the condition
* that expected states with the [State.optional] flag can be skipped.
*
* Note: When [fullMatch] is not set, this method can return _true_ even if not all expected states are used.
*
* @param expectedStates expected states (can include optional states which don't have to be present in actual states)
* @param actualStates actual states
* @param fullMatch whether all non-optional expected states must be present in actual states
*/
private fun statesMatch(expectedStates: List<State>, actualStates: List<State>, fullMatch: Boolean): Boolean {
// iterate through entries
val expectedIterator = expectedStates.iterator()
for (actual in actualStates) {
if (!expectedIterator.hasNext())
return false
var expected = expectedIterator.next()
// skip optional expected entries if they don't match the actual entry
while (!actual.stateEquals(expected) && expected.optional) {
if (!expectedIterator.hasNext())
return false
expected = expectedIterator.next()
}
// we now have a non-optional expected state and it must match
if (!actual.stateEquals(expected))
return false
}
// full match: all expected states must have been used
if (fullMatch && expectedIterator.hasNext())
return false
return true
}
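// Example (hypothetical sequences): expectedStates = [(false,false), (true,false) optional, (true,true)]
// is matched by actualStates = [(false,false), (true,true)], because the optional middle state may be
// skipped, but not by [(true,true), (false,false)], because the first non-optional state doesn't match.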
// SyncStatusObserver implementation and data class
override fun onStatusChanged(which: Int) {
val state = State(
pending = ContentResolver.isSyncPending(account, authority),
active = ContentResolver.isSyncActive(account, authority)
)
synchronized(recordedStates) {
if (recordedStates.lastOrNull() != state) {
logger.info("$account syncState = $state")
recordedStates += state
}
}
}
data class State(
val pending: Boolean,
val active: Boolean,
val optional: Boolean = false
) {
fun stateEquals(other: State) =
pending == other.pending && active == other.active
}
companion object {
var globalAutoSyncBeforeTest = false
@BeforeClass
@JvmStatic
fun before() {
globalAutoSyncBeforeTest = ContentResolver.getMasterSyncAutomatically()
// We'll request syncs explicitly and with SYNC_EXTRAS_IGNORE_SETTINGS
ContentResolver.setMasterSyncAutomatically(false)
}
@AfterClass
@JvmStatic
fun after() {
ContentResolver.setMasterSyncAutomatically(globalAutoSyncBeforeTest)
}
}
}

View file

@ -0,0 +1,51 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.AbstractThreadedSyncAdapter
import android.content.ContentProviderClient
import android.content.Context
import android.content.SyncResult
import android.os.Bundle
import android.os.IBinder
import at.bitfire.davdroid.sync.adapter.SyncAdapter
import dagger.hilt.android.qualifiers.ApplicationContext
import java.util.logging.Level
import java.util.logging.Logger
import javax.inject.Inject
class FakeSyncAdapter @Inject constructor(
@ApplicationContext context: Context,
private val logger: Logger
): AbstractThreadedSyncAdapter(context, true), SyncAdapter {
init {
logger.info("FakeSyncAdapter created")
}
override fun onPerformSync(account: Account, extras: Bundle, authority: String, provider: ContentProviderClient, syncResult: SyncResult) {
logger.log(
Level.INFO,
"onPerformSync(account=$account, extras=$extras, authority=$authority, syncResult=$syncResult)",
extras.keySet().map { key -> "extras[$key] = ${extras[key]}" }
)
// fake 5 sec sync
try {
Thread.sleep(5000)
} catch (_: InterruptedException) {
logger.info("onPerformSync($account) cancelled")
}
logger.info("onPerformSync($account) finished")
}
// SyncAdapter implementation and Hilt module
override fun getBinder(): IBinder = syncAdapterBinder
}

View file

@ -0,0 +1,187 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.ContentProviderClient
import android.content.Context
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.repository.DavServiceRepository
import at.bitfire.davdroid.resource.LocalJtxCollection
import at.bitfire.davdroid.resource.LocalJtxCollectionStore
import at.bitfire.davdroid.sync.account.TestAccount
import at.bitfire.davdroid.util.PermissionUtils
import at.bitfire.ical4android.TaskProvider
import at.bitfire.ical4android.util.MiscUtils.closeCompat
import at.bitfire.synctools.test.GrantPermissionOrSkipRule
import at.techbee.jtx.JtxContract
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assume.assumeNotNull
import org.junit.Assume.assumeTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.io.StringReader
import javax.inject.Inject
/**
* Ensure you have jtxBoard installed on the emulator before running these tests; otherwise they
* will be skipped.
*/
@HiltAndroidTest
class JtxSyncManagerTest {
@Inject
@ApplicationContext
lateinit var context: Context
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var serviceRepository: DavServiceRepository
@Inject
lateinit var localJtxCollectionStore: LocalJtxCollectionStore
@Inject
lateinit var jtxSyncManagerFactory: JtxSyncManager.Factory
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val permissionRule = GrantPermissionOrSkipRule(TaskProvider.PERMISSIONS_JTX.toSet())
lateinit var account: Account
private lateinit var provider: ContentProviderClient
private lateinit var syncManager: JtxSyncManager
private lateinit var localJtxCollection: LocalJtxCollection
@Before
fun setUp() {
hiltRule.inject()
// Check that jtxBoard permissions were granted (implies jtxBoard is installed); skip the test otherwise
assumeTrue(PermissionUtils.havePermissions(context, TaskProvider.PERMISSIONS_JTX))
// Acquire the jtx content provider
val providerOrNull = context.contentResolver.acquireContentProviderClient(JtxContract.AUTHORITY)
assumeNotNull(providerOrNull)
provider = providerOrNull!!
account = TestAccount.create()
// Create dummy dependencies
val service = Service(0, account.name, Service.TYPE_CALDAV, null)
val serviceId = serviceRepository.insertOrReplaceBlocking(service)
val dbCollection = Collection(
0,
serviceId,
type = Collection.TYPE_CALENDAR,
url = "https://example.com".toHttpUrl()
)
localJtxCollection = localJtxCollectionStore.create(provider, dbCollection)!!
syncManager = jtxSyncManagerFactory.jtxSyncManager(
account = account,
httpClient = httpClientBuilder.build(),
syncResult = SyncResult(),
localCollection = localJtxCollection,
collection = dbCollection,
resync = null
)
}
@After
fun tearDown() {
if (this::localJtxCollection.isInitialized)
localJtxCollectionStore.delete(localJtxCollection)
serviceRepository.deleteAllBlocking()
if (this::provider.isInitialized)
provider.closeCompat()
if (this::account.isInitialized)
TestAccount.remove(account)
}
@Test
fun testProcessICalObject_addsVtodo() {
val calendar = "BEGIN:VCALENDAR\n" +
"PRODID:-Vivaldi Calendar V1.0//EN\n" +
"VERSION:2.0\n" +
"BEGIN:VTODO\n" +
"SUMMARY:Test Task (Main VTODO)\n" +
"DTSTAMP;VALUE=DATE-TIME:20250228T032800Z\n" +
"UID:47a23c66-8c1a-4b44-bbe8-ebf33f8cf80f\n" +
"END:VTODO\n" +
"END:VCALENDAR"
// Should create "demo-calendar"
syncManager.processICalObject("demo-calendar", "abc123", StringReader(calendar))
// Verify main VTODO is created
val localJtxIcalObject = localJtxCollection.findByName("demo-calendar")!!
assertEquals("47a23c66-8c1a-4b44-bbe8-ebf33f8cf80f", localJtxIcalObject.uid)
assertEquals("abc123", localJtxIcalObject.eTag)
assertEquals("Test Task (Main VTODO)", localJtxIcalObject.summary)
}
@Test
fun testProcessICalObject_addsRecurringVtodo_withoutDtStart() {
// Valid calendar example (See bitfireAT/davx5-ose#1265)
// Note: We don't support starting a recurrence from DUE (RFC 5545 leaves it open to interpretation)
val calendar = "BEGIN:VCALENDAR\n" +
"PRODID:-Vivaldi Calendar V1.0//EN\n" +
"VERSION:2.0\n" +
"BEGIN:VTODO\n" +
"SUMMARY:Test Task (Exception)\n" +
"DTSTAMP;VALUE=DATE-TIME:20250228T032800Z\n" +
"DUE;TZID=America/New_York:20250228T130000\n" +
"RECURRENCE-ID;TZID=America/New_York:20250228T130000\n" +
"UID:47a23c66-8c1a-4b44-bbe8-ebf33f8cf80f\n" +
"END:VTODO\n" +
"BEGIN:VTODO\n" +
"SUMMARY:Test Task (Main VTODO)\n" +
"DTSTAMP;VALUE=DATE-TIME:20250228T032800Z\n" +
"DUE;TZID=America/New_York:20250228T130000\n" + // Due date will NOT be assumed as start for recurrence
"SEQUENCE:1\n" +
"UID:47a23c66-8c1a-4b44-bbe8-ebf33f8cf80f\n" +
"RRULE:FREQ=WEEKLY;INTERVAL=1;BYDAY=FR;UNTIL=20250505T235959Z\n" +
"END:VTODO\n" +
"END:VCALENDAR"
// Create and store calendar
syncManager.processICalObject("demo-calendar", "abc123", StringReader(calendar))
// Verify main VTODO was created with RRULE present
val mainVtodo = localJtxCollection.findByName("demo-calendar")!!
assertEquals("Test Task (Main VTODO)", mainVtodo.summary)
assertEquals("FREQ=WEEKLY;UNTIL=20250505T235959Z;INTERVAL=1;BYDAY=FR", mainVtodo.rrule)
// Verify the RRULE exception instance was created with correct recurrence-id timezone
val vtodoException = localJtxCollection.findRecurInstance(
uid = "47a23c66-8c1a-4b44-bbe8-ebf33f8cf80f",
recurid = "20250228T130000"
)!!
assertEquals("Test Task (Exception)", vtodoException.summary)
assertEquals("America/New_York", vtodoException.recuridTimezone)
}
}

View file

@ -0,0 +1,47 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import at.bitfire.davdroid.resource.LocalCollection
import at.bitfire.davdroid.resource.SyncState
class LocalTestCollection(
override val dbCollectionId: Long = 0L
): LocalCollection<LocalTestResource> {
override val tag = "LocalTestCollection"
override val title = "Local Test Collection"
override var lastSyncState: SyncState? = null
val entries = mutableListOf<LocalTestResource>()
override val readOnly: Boolean
get() = throw NotImplementedError()
override fun findDeleted() = entries.filter { it.deleted }
override fun findDirty() = entries.filter { it.dirty }
override fun findByName(name: String) = entries.firstOrNull { it.fileName == name }
override fun markNotDirty(flags: Int): Int {
var updated = 0
for (dirty in findDirty()) {
dirty.flags = flags
updated++
}
return updated
}
override fun removeNotDirtyMarked(flags: Int): Int {
val numBefore = entries.size
entries.removeIf { !it.dirty && it.flags == flags }
return numBefore - entries.size
}
override fun forgetETags() {
}
}

View file

@ -0,0 +1,44 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.content.Context
import at.bitfire.davdroid.resource.LocalResource
import java.util.Optional
class LocalTestResource: LocalResource<Any> {
override val id: Long? = null
override var fileName: String? = null
override var eTag: String? = null
override var scheduleTag: String? = null
override var flags: Int = 0
var deleted = false
var dirty = false
override fun prepareForUpload() = "generated-file.txt"
override fun clearDirty(fileName: Optional<String>, eTag: String?, scheduleTag: String?) {
dirty = false
if (fileName.isPresent)
this.fileName = fileName.get()
this.eTag = eTag
this.scheduleTag = scheduleTag
}
override fun updateFlags(flags: Int) {
this.flags = flags
}
override fun update(data: Any, fileName: String?, eTag: String?, scheduleTag: String?, flags: Int) = throw NotImplementedError()
override fun deleteLocal() = throw NotImplementedError()
override fun resetDeleted() = throw NotImplementedError()
override fun getDebugSummary() = "Test Resource"
override fun getViewUri(context: Context) = null
}

View file

@@ -0,0 +1,158 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.ContentResolver
import android.content.Context
import android.content.SyncResult
import android.os.Bundle
import android.provider.CalendarContract
import androidx.hilt.work.HiltWorkerFactory
import androidx.work.WorkInfo
import androidx.work.WorkManager
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.sync.account.TestAccount
import at.bitfire.davdroid.sync.adapter.SyncAdapterImpl
import at.bitfire.davdroid.sync.worker.SyncWorkerManager
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.Awaits
import io.mockk.coEvery
import io.mockk.every
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import io.mockk.just
import io.mockk.mockk
import io.mockk.mockkObject
import io.mockk.mockkStatic
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.launch
import kotlinx.coroutines.test.runTest
import kotlinx.coroutines.withTimeout
import org.junit.After
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
import javax.inject.Provider
import kotlin.coroutines.cancellation.CancellationException
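/**
 * Verifies that [SyncAdapterImpl.onPerformSync] returns when the sync framework cancels the sync,
 * when the (simulated) timeout is reached, and when the enqueued sync worker finishes in time.
 */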
@HiltAndroidTest
class SyncAdapterImplTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var syncAdapterImplProvider: Provider<SyncAdapterImpl>
@BindValue @MockK
lateinit var syncWorkerManager: SyncWorkerManager
@Inject
lateinit var workerFactory: HiltWorkerFactory
lateinit var account: Account
private var masterSyncStateBeforeTest = ContentResolver.getMasterSyncAutomatically()
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context, workerFactory)
account = TestAccount.create()
ContentResolver.setMasterSyncAutomatically(true)
ContentResolver.setSyncAutomatically(account, CalendarContract.AUTHORITY, true)
ContentResolver.setIsSyncable(account, CalendarContract.AUTHORITY, 1)
}
@After
fun tearDown() {
ContentResolver.setMasterSyncAutomatically(masterSyncStateBeforeTest)
TestAccount.remove(account)
}
@Test
fun testSyncAdapter_onPerformSync_cancellation() = runTest {
val workManager = WorkManager.getInstance(context)
val syncAdapter = syncAdapterImplProvider.get()
mockkObject(workManager) {
// don't actually create a worker
every { syncWorkerManager.enqueueOneTime(any(), any()) } returns "TheSyncWorker"
// assume worker takes a long time
every { workManager.getWorkInfosForUniqueWorkFlow("TheSyncWorker") } just Awaits
val sync = launch {
syncAdapter.onPerformSync(account, Bundle(), CalendarContract.AUTHORITY, mockk(), SyncResult())
}
// simulate incoming cancellation from sync framework
syncAdapter.onSyncCanceled()
// wait for sync to finish (should happen immediately)
sync.join()
}
}
@Test
fun testSyncAdapter_onPerformSync_returnsAfterTimeout() {
val workManager = WorkManager.getInstance(context)
val syncAdapter = syncAdapterImplProvider.get()
mockkObject(workManager) {
// don't actually create a worker
every { syncWorkerManager.enqueueOneTime(any(), any()) } returns "TheSyncWorker"
// assume worker takes a long time
every { workManager.getWorkInfosForUniqueWorkFlow("TheSyncWorker") } just Awaits
mockkStatic("kotlinx.coroutines.TimeoutKt") { // mock global extension function
// immediate timeout (instead of really waiting)
coEvery { withTimeout(any<Long>(), any<suspend CoroutineScope.() -> Unit>()) } throws CancellationException("Simulated timeout")
syncAdapter.onPerformSync(account, Bundle(), CalendarContract.AUTHORITY, mockk(), SyncResult())
}
}
}
@Test
fun testSyncAdapter_onPerformSync_runsInTime() {
val workManager = WorkManager.getInstance(context)
val syncAdapter = syncAdapterImplProvider.get()
mockkObject(workManager) {
// don't actually create a worker
every { syncWorkerManager.enqueueOneTime(any(), any()) } returns "TheSyncWorker"
// assume worker immediately returns with success
val success = mockk<WorkInfo>()
every { success.state } returns WorkInfo.State.SUCCEEDED
every { workManager.getWorkInfosForUniqueWorkFlow("TheSyncWorker") } returns flow {
emit(listOf(success))
delay(60000) // keep the flow active
}
// should just run
syncAdapter.onPerformSync(account, Bundle(), CalendarContract.AUTHORITY, mockk(), SyncResult())
}
}
}

View file

@@ -0,0 +1,503 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.Context
import androidx.core.app.NotificationManagerCompat
import androidx.hilt.work.HiltWorkerFactory
import at.bitfire.dav4jvm.PropStat
import at.bitfire.dav4jvm.Response
import at.bitfire.dav4jvm.Response.HrefRelation
import at.bitfire.dav4jvm.property.webdav.GetETag
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.TestUtils.assertWithin
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.repository.DavSyncStatsRepository
import at.bitfire.davdroid.resource.SyncState
import at.bitfire.davdroid.settings.AccountSettings
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import kotlinx.coroutines.test.runTest
import okhttp3.Protocol
import okhttp3.internal.http.StatusLine
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.time.Instant
import javax.inject.Inject
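/**
 * Tests [SyncManager.performSync] against a [MockWebServer], using [TestSyncManager] together
 * with [LocalTestCollection] and [LocalTestResource] as fake local storage.
 */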
@HiltAndroidTest
class SyncManagerTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockKRule = MockKRule(this)
@Inject
lateinit var accountSettingsFactory: AccountSettings.Factory
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var httpClientBuilder: HttpClient.Builder
@Inject
lateinit var syncManagerFactory: TestSyncManager.Factory
@BindValue
@RelaxedMockK
lateinit var syncStatsRepository: DavSyncStatsRepository
@Inject
lateinit var workerFactory: HiltWorkerFactory
private lateinit var account: Account
private lateinit var server: MockWebServer
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context, workerFactory)
account = TestAccount.create()
server = MockWebServer().apply {
start()
}
}
@After
fun tearDown() {
TestAccount.remove(account)
// clear annoying syncError notifications
NotificationManagerCompat.from(context).cancelAll()
server.close()
}
private fun queryCapabilitiesResponse(cTag: String? = null): MockResponse {
val body = StringBuilder()
body.append(
"<?xml version=\"1.0\" encoding=\"utf-8\" ?>\n" +
"<multistatus xmlns=\"DAV:\" xmlns:CALDAV=\"http://calendarserver.org/ns/\">\n" +
" <response>\n" +
" <href>/</href>\n" +
" <propstat>\n" +
" <prop>\n"
)
if (cTag != null)
body.append("<CALDAV:getctag>$cTag</CALDAV:getctag>\n")
body.append(
" </prop>\n" +
" </propstat>\n" +
" </response>\n" +
"</multistatus>"
)
return MockResponse()
.setResponseCode(207)
.setHeader("Content-Type", "text/xml")
.setBody(body.toString())
}
@Test
fun testPerformSync_503RetryAfter_DelaySeconds() = runTest {
server.enqueue(MockResponse()
.setResponseCode(503)
.setHeader("Retry-After", "60")) // 60 seconds
val result = SyncResult()
val syncManager = syncManager(LocalTestCollection(), result)
syncManager.performSync()
val expected = Instant.now()
.plusSeconds(60)
.toEpochMilli()
// 5 sec tolerance for test
assertWithin(expected, result.delayUntil*1000, 5000)
}
@Test
fun testPerformSync_FirstSync_Empty() = runTest {
val collection = LocalTestCollection() /* no last known ctag */
server.enqueue(queryCapabilitiesResponse())
val syncManager = syncManager(collection)
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertTrue(collection.entries.isEmpty())
}
@Test
fun testPerformSync_UploadNewMember_ETagOnPut() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
dirty = true
}
}
server.enqueue(queryCapabilitiesResponse("ctag1"))
// PUT -> 204 No Content
server.enqueue(MockResponse()
.setResponseCode(204)
.setHeader("ETag", "etag-from-put"))
// modifications sent, so DAVx5 will query CTag again
server.enqueue(queryCapabilitiesResponse("ctag2"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/generated-file.txt"),
null,
listOf(PropStat(
listOf(
GetETag("\"etag-from-put\"")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
}
syncManager.performSync()
assertTrue(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("etag-from-put", collection.entries.first().eTag)
}
@Test
fun testPerformSync_UploadModifiedMember_ETagOnPut() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
fileName = "existing-file.txt"
eTag = "old-etag-like-on-server"
dirty = true
}
}
server.enqueue(queryCapabilitiesResponse("ctag1"))
// PUT -> 204 No Content
server.enqueue(MockResponse()
.setResponseCode(204)
.addHeader("ETag", "etag-from-put"))
// modifications sent, so DAVx5 will query CTag again
server.enqueue(queryCapabilitiesResponse("ctag2"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/existing-file.txt"),
null,
listOf(PropStat(
listOf(
GetETag("etag-from-put")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
assertDownloadRemote = mapOf(Pair(server.url("/existing-file.txt"), "etag-from-put"))
}
syncManager.performSync()
assertTrue(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("etag-from-put", collection.entries.first().eTag)
}
@Test
fun testPerformSync_UploadModifiedMember_NoETagOnPut() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
fileName = "existing-file.txt"
eTag = "old-etag-like-on-server"
dirty = true
}
}
server.enqueue(queryCapabilitiesResponse("ctag1"))
// PUT -> 204 No Content
server.enqueue(MockResponse().setResponseCode(204))
// modifications sent, so DAVx5 will query CTag again
server.enqueue(queryCapabilitiesResponse("ctag2"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/existing-file.txt"),
null,
listOf(PropStat(
listOf(
GetETag("etag-from-propfind")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
assertDownloadRemote = mapOf(Pair(server.url("/existing-file.txt"), "etag-from-propfind"))
}
syncManager.performSync()
assertTrue(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertTrue(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("etag-from-propfind", collection.entries.first().eTag)
}
@Test
fun testPerformSync_UploadModifiedMember_412PreconditionFailed() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
fileName = "existing-file.txt"
eTag = "etag-that-has-been-changed-on-server-in-the-meanwhile"
dirty = true
}
}
server.enqueue(queryCapabilitiesResponse("ctag1"))
// PUT -> 412 Precondition Failed
server.enqueue(MockResponse()
.setResponseCode(412))
// modifications sent, so DAVx5 will query CTag again
server.enqueue(queryCapabilitiesResponse("ctag1"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/existing-file.txt"),
null,
listOf(PropStat(
listOf(
GetETag("changed-etag-from-server")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
assertDownloadRemote = mapOf(Pair(server.url("/existing-file.txt"), "changed-etag-from-server"))
}
syncManager.performSync()
assertTrue(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertTrue(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("changed-etag-from-server", collection.entries.first().eTag)
}
@Test
fun testPerformSync_NoopOnMemberWithSameETag() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "ctag1")
entries += LocalTestResource().apply {
fileName = "downloaded-member.txt"
eTag = "MemberETag1"
}
}
server.enqueue(queryCapabilitiesResponse("ctag2"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/downloaded-member.txt"),
null,
listOf(PropStat(
listOf(
GetETag("\"MemberETag1\"")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
}
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("MemberETag1", collection.entries.first().eTag)
}
@Test
fun testPerformSync_DownloadNewMember() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
}
server.enqueue(queryCapabilitiesResponse(cTag = "new-ctag"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/new-member.txt"),
null,
listOf(PropStat(
listOf(
GetETag("\"NewMemberETag1\"")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
assertDownloadRemote = mapOf(Pair(server.url("/new-member.txt"), "NewMemberETag1"))
}
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertTrue(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("NewMemberETag1", collection.entries.first().eTag)
}
@Test
fun testPerformSync_DownloadUpdatedMember() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
fileName = "downloaded-member.txt"
eTag = "MemberETag1"
}
}
server.enqueue(queryCapabilitiesResponse(cTag = "new-ctag"))
val syncManager = syncManager(collection).apply {
listAllRemoteResult = listOf(
Pair(Response(
server.url("/"),
server.url("/downloaded-member.txt"),
null,
listOf(PropStat(
listOf(
GetETag("\"MemberETag2\"")
),
StatusLine(Protocol.HTTP_1_1, 200, "OK")
)
)), HrefRelation.MEMBER)
)
assertDownloadRemote = mapOf(Pair(server.url("/downloaded-member.txt"), "MemberETag2"))
}
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertTrue(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertEquals(1, collection.entries.size)
assertEquals("MemberETag2", collection.entries.first().eTag)
}
@Test
fun testPerformSync_RemoveVanishedMember() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "old-ctag")
entries += LocalTestResource().apply {
fileName = "downloaded-member.txt"
}
}
server.enqueue(queryCapabilitiesResponse(cTag = "new-ctag"))
val syncManager = syncManager(collection)
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertTrue(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertTrue(collection.entries.isEmpty())
}
@Test
fun testPerformSync_CTagDidntChange() = runTest {
val collection = LocalTestCollection().apply {
lastSyncState = SyncState(SyncState.Type.CTAG, "ctag1")
}
server.enqueue(queryCapabilitiesResponse("ctag1"))
val syncManager = syncManager(collection)
syncManager.performSync()
assertFalse(syncManager.didGenerateUpload)
assertFalse(syncManager.didListAllRemote)
assertFalse(syncManager.didDownloadRemote)
assertFalse(syncManager.syncResult.hasError())
assertTrue(collection.entries.isEmpty())
}
// helpers
private fun syncManager(
localCollection: LocalTestCollection,
syncResult: SyncResult = SyncResult(),
collection: Collection = mockk<Collection>(relaxed = true) {
every { id } returns 1
every { url } returns server.url("/")
}
) = syncManagerFactory.create(
account,
httpClientBuilder.build(),
syncResult,
localCollection,
collection
)
}

View file

@@ -0,0 +1,228 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import android.content.ContentProviderClient
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.resource.LocalDataStore
import io.mockk.every
import io.mockk.impl.annotations.InjectMockKs
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.impl.annotations.SpyK
import io.mockk.junit4.MockKRule
import io.mockk.just
import io.mockk.mockk
import io.mockk.runs
import io.mockk.verify
import org.junit.Assert.assertArrayEquals
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
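/**
 * Unit tests for the collection handling of [Syncer] (prepare, creating/updating/deleting local
 * collections and dispatching the per-collection content sync), using a mocked data store and
 * content provider.
 */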
class SyncerTest {
@get:Rule
val mockkRule = MockKRule(this)
@RelaxedMockK
lateinit var logger: Logger
val dataStore: LocalTestStore = mockk(relaxed = true)
val provider: ContentProviderClient = mockk(relaxed = true)
@SpyK
@InjectMockKs
var syncer = TestSyncer(mockk(relaxed = true), null, SyncResult(), dataStore)
@Test
fun testSync_prepare_fails() {
every { syncer.prepare(provider) } returns false
every { syncer.getSyncEnabledCollections() } returns emptyMap()
// Should stop the sync after prepare returns false
syncer.sync(provider)
verify(exactly = 1) { syncer.prepare(provider) }
verify(exactly = 0) { syncer.getSyncEnabledCollections() }
}
@Test
fun testSync_prepare_succeeds() {
every { syncer.prepare(provider) } returns true
every { syncer.getSyncEnabledCollections() } returns emptyMap()
// Should continue the sync after prepare returns true
syncer.sync(provider)
verify(exactly = 1) { syncer.prepare(provider) }
verify(exactly = 1) { syncer.getSyncEnabledCollections() }
}
@Test
fun testUpdateCollections_deletesCollection() {
val localCollection = mockk<LocalTestCollection> {
every { dbCollectionId } returns 0L
every { title } returns "Collection to be deleted locally"
}
// Should delete the localCollection if dbCollection (remote) does not exist
val localCollections = mutableListOf(localCollection)
val result = syncer.updateCollections(mockk(), localCollections, emptyMap())
verify(exactly = 1) { dataStore.delete(localCollection) }
// Updated local collection list should be empty
assertTrue(result.isEmpty())
}
@Test
fun testUpdateCollections_updatesCollection() {
val localCollection = mockk<LocalTestCollection> {
every { dbCollectionId } returns 0L
every { title } returns "The Local Collection"
}
val dbCollection = mockk<Collection> {
every { id } returns 0L
}
val dbCollections = mapOf(0L to dbCollection)
// Should update the localCollection if it exists
val result = syncer.updateCollections(provider, listOf(localCollection), dbCollections)
verify(exactly = 1) { dataStore.update(provider, localCollection, dbCollection) }
// Updated local collection list should be the same as the input
assertArrayEquals(arrayOf(localCollection), result.toTypedArray())
}
@Test
fun testUpdateCollections_findsNewCollection() {
val dbCollection = mockk<Collection> {
every { id } returns 0L
}
val localCollections = listOf(mockk<LocalTestCollection> {
every { dbCollectionId } returns 0L
})
val dbCollections = listOf(dbCollection)
val dbCollectionsMap = mapOf(dbCollection.id to dbCollection)
every { syncer.createLocalCollections(provider, dbCollections) } returns localCollections
// Should return the new collection, because it was not updated
val result = syncer.updateCollections(provider, emptyList(), dbCollectionsMap)
// Updated local collection list should contain the new entry
assertEquals(1, result.size)
assertEquals(dbCollection.id, result[0].dbCollectionId)
}
@Test
fun testCreateLocalCollections() {
val localCollection = mockk<LocalTestCollection>()
val dbCollection = mockk<Collection>()
every { dataStore.create(provider, dbCollection) } returns localCollection
// Should return list of newly created local collections
val result = syncer.createLocalCollections(provider, listOf(dbCollection))
assertEquals(listOf(localCollection), result)
}
@Test
fun testSyncCollectionContents() {
val dbCollection1 = mockk<Collection>()
val dbCollection2 = mockk<Collection>()
val dbCollections = mapOf(
0L to dbCollection1,
1L to dbCollection2
)
val localCollection1 = mockk<LocalTestCollection> { every { dbCollectionId } returns 0L }
val localCollection2 = mockk<LocalTestCollection> { every { dbCollectionId } returns 1L }
val localCollections = listOf(localCollection1, localCollection2)
every { localCollection1.dbCollectionId } returns 0L
every { localCollection2.dbCollectionId } returns 1L
every { syncer.syncCollection(provider, any(), any()) } just runs
// Should call the collection content sync on both collections
syncer.syncCollectionContents(provider, localCollections, dbCollections)
verify(exactly = 1) { syncer.syncCollection(provider, localCollection1, dbCollection1) }
verify(exactly = 1) { syncer.syncCollection(provider, localCollection2, dbCollection2) }
}
// Test helpers
class TestSyncer(
account: Account,
resyncType: ResyncType?,
syncResult: SyncResult,
theDataStore: LocalTestStore
) : Syncer<LocalTestStore, LocalTestCollection>(account, resyncType, syncResult) {
override val dataStore: LocalTestStore =
theDataStore
override val serviceType: String
get() = throw NotImplementedError()
override fun prepare(provider: ContentProviderClient): Boolean =
throw NotImplementedError()
override fun getDbSyncCollections(serviceId: Long): List<Collection> =
throw NotImplementedError()
override fun syncCollection(
provider: ContentProviderClient,
localCollection: LocalTestCollection,
remoteCollection: Collection
) {
throw NotImplementedError()
}
}
class LocalTestStore : LocalDataStore<LocalTestCollection> {
override val authority: String
get() = throw NotImplementedError()
override fun acquireContentProvider(throwOnMissingPermissions: Boolean): ContentProviderClient? {
throw NotImplementedError()
}
override fun create(
provider: ContentProviderClient,
fromCollection: Collection
): LocalTestCollection? {
throw NotImplementedError()
}
override fun getAll(
account: Account,
provider: ContentProviderClient
): List<LocalTestCollection> {
throw NotImplementedError()
}
override fun update(
provider: ContentProviderClient,
localCollection: LocalTestCollection,
fromCollection: Collection
) {
throw NotImplementedError()
}
override fun delete(localCollection: LocalTestCollection) {
throw NotImplementedError()
}
override fun updateAccount(oldAccount: Account, newAccount: Account) {
throw NotImplementedError()
}
}
}

View file

@@ -0,0 +1,123 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync
import android.accounts.Account
import at.bitfire.dav4jvm.DavCollection
import at.bitfire.dav4jvm.MultiResponseCallback
import at.bitfire.dav4jvm.Response
import at.bitfire.dav4jvm.property.caldav.GetCTag
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.di.SyncDispatcher
import at.bitfire.davdroid.network.HttpClient
import at.bitfire.davdroid.resource.LocalResource
import at.bitfire.davdroid.resource.SyncState
import at.bitfire.davdroid.util.DavUtils.lastSegment
import dagger.assisted.Assisted
import dagger.assisted.AssistedFactory
import dagger.assisted.AssistedInject
import kotlinx.coroutines.CoroutineDispatcher
import okhttp3.HttpUrl
import okhttp3.RequestBody
import okhttp3.RequestBody.Companion.toRequestBody
import org.junit.Assert.assertEquals
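/**
 * [SyncManager] implementation for tests: it records which sync steps were executed
 * ([didQueryCapabilities], [didGenerateUpload], [didListAllRemote], [didDownloadRemote]) and
 * replays the responses configured in [listAllRemoteResult] and [assertDownloadRemote] instead
 * of performing real listing/download requests.
 */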
class TestSyncManager @AssistedInject constructor(
@Assisted account: Account,
@Assisted httpClient: HttpClient,
@Assisted syncResult: SyncResult,
@Assisted localCollection: LocalTestCollection,
@Assisted collection: Collection,
@SyncDispatcher syncDispatcher: CoroutineDispatcher
): SyncManager<LocalTestResource, LocalTestCollection, DavCollection>(
account,
httpClient,
SyncDataType.EVENTS,
syncResult,
localCollection,
collection,
resync = null,
syncDispatcher
) {
@AssistedFactory
interface Factory {
fun create(
account: Account,
httpClient: HttpClient,
syncResult: SyncResult,
localCollection: LocalTestCollection,
collection: Collection
): TestSyncManager
}
override fun prepare(): Boolean {
davCollection = DavCollection(httpClient.okHttpClient, collection.url)
return true
}
var didQueryCapabilities = false
override suspend fun queryCapabilities(): SyncState? {
if (didQueryCapabilities)
throw IllegalStateException("queryCapabilities() must not be called twice")
didQueryCapabilities = true
var cTag: SyncState? = null
davCollection.propfind(0, GetCTag.NAME) { response, rel ->
if (rel == Response.HrefRelation.SELF)
response[GetCTag::class.java]?.cTag?.let {
cTag = SyncState(SyncState.Type.CTAG, it)
}
}
return cTag
}
var didGenerateUpload = false
override fun generateUpload(resource: LocalTestResource): RequestBody {
didGenerateUpload = true
return resource.toString().toRequestBody()
}
override fun syncAlgorithm() = SyncAlgorithm.PROPFIND_REPORT
var listAllRemoteResult = emptyList<Pair<Response, Response.HrefRelation>>()
var didListAllRemote = false
override suspend fun listAllRemote(callback: MultiResponseCallback) {
if (didListAllRemote)
throw IllegalStateException("listAllRemote() must not be called twice")
didListAllRemote = true
for (result in listAllRemoteResult)
callback.onResponse(result.first, result.second)
}
var assertDownloadRemote = emptyMap<HttpUrl, String>()
var didDownloadRemote = false
override suspend fun downloadRemote(bunch: List<HttpUrl>) {
didDownloadRemote = true
assertEquals(assertDownloadRemote.keys.toList(), bunch)
for ((url, eTag) in assertDownloadRemote) {
val fileName = url.lastSegment
var localEntry = localCollection.entries.firstOrNull { it.fileName == fileName }
if (localEntry == null) {
val newEntry = LocalTestResource().also {
it.fileName = fileName
}
localCollection.entries += newEntry
localEntry = newEntry
}
localEntry.eTag = eTag
localEntry.flags = LocalResource.FLAG_REMOTELY_PRESENT
}
}
override fun postProcess() {
}
override fun notifyInvalidResourceTitle() =
throw NotImplementedError()
}

View file

@@ -0,0 +1,163 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.account
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import android.os.Bundle
import androidx.hilt.work.HiltWorkerFactory
import androidx.work.testing.TestListenableWorkerBuilder
import at.bitfire.davdroid.R
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.resource.LocalAddressBook
import at.bitfire.davdroid.settings.SettingsManager
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
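/**
 * Verifies that [AccountsCleanupWorker] removes services and address book accounts whose
 * parent account no longer exists, and keeps those that still belong to an existing account.
 */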
@HiltAndroidTest
class AccountsCleanupWorkerTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
lateinit var accountsCleanupWorkerFactory: AccountsCleanupWorker.Factory
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var db: AppDatabase
@Inject
lateinit var settingsManager: SettingsManager
@Inject
lateinit var workerFactory: HiltWorkerFactory
lateinit var accountManager: AccountManager
lateinit var addressBookAccountType: String
lateinit var addressBookAccount: Account
lateinit var service: Service
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context, workerFactory)
accountManager = AccountManager.get(context)
service = createTestService()
addressBookAccountType = context.getString(R.string.account_type_address_book)
addressBookAccount = Account("Fancy address book account", addressBookAccountType)
}
@After
fun tearDown() {
// Remove the account in any case; this is helpful when a test fails
accountManager.removeAccountExplicitly(addressBookAccount)
}
@Test
fun testCleanUpServices_noAccount() {
// Insert a service that references an invalid account
db.serviceDao().insertOrReplace(Service(id = 1, accountName = "test", type = Service.TYPE_CALDAV, principal = null))
assertNotNull(db.serviceDao().get(1))
// Create worker and run the method
val worker = TestListenableWorkerBuilder<AccountsCleanupWorker>(context)
.setWorkerFactory(workerFactory)
.build()
worker.cleanUpServices()
// Verify that service is deleted
assertNull(db.serviceDao().get(1))
}
@Test
fun testCleanUpServices_oneAccount() {
TestAccount.provide { existingAccount ->
// Insert two services: one that references the existing account and one that references an invalid account
db.serviceDao().insertOrReplace(Service(id = 1, accountName = existingAccount.name, type = Service.TYPE_CALDAV, principal = null))
assertNotNull(db.serviceDao().get(1))
db.serviceDao().insertOrReplace(Service(id = 2, accountName = "not existing", type = Service.TYPE_CARDDAV, principal = null))
assertNotNull(db.serviceDao().get(2))
// Create worker and run the method
val worker = TestListenableWorkerBuilder<AccountsCleanupWorker>(context)
.setWorkerFactory(workerFactory)
.build()
worker.cleanUpServices()
// Verify that one service is deleted and the other one is kept
assertNotNull(db.serviceDao().get(1))
assertNull(db.serviceDao().get(2))
}
}
@Test
fun testCleanUpAddressBooks_deletesAddressBookWithoutAccount() {
// Create an address book account without a corresponding account
assertTrue(accountManager.addAccountExplicitly(addressBookAccount, null, null))
assertEquals(listOf(addressBookAccount), accountManager.getAccountsByType(addressBookAccountType).toList())
// Create worker and run the method
val worker = TestListenableWorkerBuilder<AccountsCleanupWorker>(context)
.setWorkerFactory(workerFactory)
.build()
worker.cleanUpAddressBooks()
// Verify account was deleted
assertTrue(accountManager.getAccountsByType(addressBookAccountType).isEmpty())
}
@Test
fun testCleanUpAddressBooks_keepsAddressBookWithAccount() {
TestAccount.provide { existingAccount ->
// Create an address book account _with_ a corresponding account and verify
val userData = Bundle(2).apply {
putString(LocalAddressBook.USER_DATA_ACCOUNT_NAME, existingAccount.name)
putString(LocalAddressBook.USER_DATA_ACCOUNT_TYPE, existingAccount.type)
}
assertTrue(accountManager.addAccountExplicitly(addressBookAccount, null, userData))
assertEquals(listOf(addressBookAccount), accountManager.getAccountsByType(addressBookAccountType).toList())
// Create worker and run the method
val worker = TestListenableWorkerBuilder<AccountsCleanupWorker>(context)
.setWorkerFactory(workerFactory)
.build()
worker.cleanUpAddressBooks()
// Verify account was _not_ deleted
assertEquals(listOf(addressBookAccount), accountManager.getAccountsByType(addressBookAccountType).toList())
}
}
// helpers
private fun createTestService(): Service {
val service = Service(id = 0, accountName = "test", type = Service.TYPE_CARDDAV, principal = null)
val serviceId = db.serviceDao().insertOrReplace(service)
return db.serviceDao().get(serviceId)!!
}
}

View file

@@ -0,0 +1,60 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.account
import android.accounts.Account
import android.accounts.AccountManager
import android.content.Context
import android.os.Bundle
import at.bitfire.davdroid.R
import at.bitfire.davdroid.settings.SettingsManager
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
@HiltAndroidTest
class SystemAccountUtilsTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var settingsManager: SettingsManager
@Before
fun setUp() {
hiltRule.inject()
}
@Test
fun testCreateAccount() {
val userData = Bundle(2)
userData.putString("int", "1")
userData.putString("string", "abc/\"-")
val account = Account("AccountUtilsTest", context.getString(R.string.account_type))
val manager = AccountManager.get(context)
try {
assertTrue(SystemAccountUtils.createAccount(context, account, userData))
// validate user data
assertEquals("1", manager.getUserData(account, "int"))
assertEquals("abc/\"-", manager.getUserData(account, "string"))
} finally {
assertTrue(manager.removeAccountExplicitly(account))
}
}
}

View file

@@ -0,0 +1,53 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.account
import android.accounts.Account
import android.accounts.AccountManager
import androidx.test.platform.app.InstrumentationRegistry
import at.bitfire.davdroid.R
import at.bitfire.davdroid.settings.AccountSettings
import org.junit.Assert.assertTrue
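/**
 * Helpers for creating and removing a test account in instrumented tests.
 *
 * Typical usage (sketch):
 * ```
 * val account = TestAccount.create()
 * try { /* test body */ } finally { TestAccount.remove(account) }
 * // or simply:
 * TestAccount.provide { account -> /* test body */ }
 * ```
 */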
object TestAccount {
private val targetContext by lazy { InstrumentationRegistry.getInstrumentation().targetContext }
/**
* Creates a test account, usually in the `@Before` setUp of a test.
*
* Remove it with [remove].
*/
fun create(version: Int = AccountSettings.CURRENT_VERSION, accountName: String = "Test Account"): Account {
val accountType = targetContext.getString(R.string.account_type)
val account = Account(accountName, accountType)
val initialData = AccountSettings.initialUserData(null)
initialData.putString(AccountSettings.KEY_SETTINGS_VERSION, version.toString())
assertTrue(SystemAccountUtils.createAccount(targetContext, account, initialData))
return account
}
/**
* Removes a test account, usually in the `@After` tearDown of a test.
*/
fun remove(account: Account) {
val am = AccountManager.get(targetContext)
assertTrue(am.removeAccountExplicitly(account))
}
/**
* Convenience method to create a test account and remove it after executing the block.
*/
fun provide(version: Int = AccountSettings.CURRENT_VERSION, block: (Account) -> Unit) {
val account = create(version)
try {
block(account)
} finally {
remove(account)
}
}
}

View file

@@ -0,0 +1,95 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.worker
import android.accounts.Account
import android.content.Context
import androidx.work.ListenableWorker
import androidx.work.WorkManager
import androidx.work.WorkerFactory
import androidx.work.WorkerParameters
import androidx.work.testing.TestListenableWorkerBuilder
import androidx.work.workDataOf
import at.bitfire.davdroid.R
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.sync.SyncDataType
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.junit4.MockKRule
import io.mockk.mockkObject
import io.mockk.verify
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
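/**
 * Verifies that [PeriodicSyncWorker] fails and cancels its own work when it is run for an
 * account that does not exist (anymore).
 */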
@HiltAndroidTest
class PeriodicSyncWorkerTest {
@Inject @ApplicationContext
lateinit var context: Context
@Inject
lateinit var syncWorkerFactory: PeriodicSyncWorker.Factory
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
lateinit var account: Account
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context)
account = TestAccount.create()
}
@After
fun tearDown() {
TestAccount.remove(account)
}
@Test
fun doWork_cancelsItselfOnInvalidAccount() = runTest {
val invalidAccount = Account("invalid", context.getString(R.string.account_type))
// Run PeriodicSyncWorker as TestWorker
val inputData = workDataOf(
BaseSyncWorker.INPUT_DATA_TYPE to SyncDataType.EVENTS.toString(),
BaseSyncWorker.INPUT_ACCOUNT_NAME to invalidAccount.name,
BaseSyncWorker.INPUT_ACCOUNT_TYPE to invalidAccount.type
)
// observe WorkManager cancellation call
val workManager = WorkManager.getInstance(context)
mockkObject(workManager)
// run test worker, expect failure
val testWorker = TestListenableWorkerBuilder<PeriodicSyncWorker>(context, inputData)
.setWorkerFactory(object: WorkerFactory() {
override fun createWorker(appContext: Context, workerClassName: String, workerParameters: WorkerParameters) =
syncWorkerFactory.create(appContext, workerParameters)
})
.build()
val result = testWorker.doWork()
assertTrue(result is ListenableWorker.Result.Failure)
// verify that worker called WorkManager.cancelWorkById(<its ID>)
verify {
workManager.cancelWorkById(testWorker.id)
}
}
}

View file

@@ -0,0 +1,280 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.worker
import android.accounts.Account
import android.content.Context
import android.net.ConnectivityManager
import android.net.Network
import android.net.NetworkCapabilities
import android.net.NetworkCapabilities.NET_CAPABILITY_INTERNET
import android.net.NetworkCapabilities.NET_CAPABILITY_NOT_VPN
import android.net.NetworkCapabilities.NET_CAPABILITY_VALIDATED
import android.net.NetworkCapabilities.TRANSPORT_WIFI
import android.net.wifi.WifiInfo
import android.net.wifi.WifiManager
import androidx.core.content.getSystemService
import at.bitfire.davdroid.settings.AccountSettings
import at.bitfire.davdroid.sync.SyncConditions
import at.bitfire.davdroid.util.PermissionUtils
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.every
import io.mockk.impl.annotations.MockK
import io.mockk.junit4.MockKRule
import io.mockk.mockk
import io.mockk.mockkObject
import io.mockk.spyk
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
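/**
 * Tests the [SyncConditions] checks (Wi-Fi SSID, Internet availability including VPN handling,
 * Wi-Fi availability and the "sync only over Wi-Fi" setting) against mocked
 * [ConnectivityManager]/[WifiManager] state.
 */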
@HiltAndroidTest
class SyncConditionsTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
@MockK
lateinit var capabilities: NetworkCapabilities
@Inject
@ApplicationContext
lateinit var context: Context
@Inject
lateinit var factory: SyncConditions.Factory
@MockK
lateinit var network1: Network
@MockK
lateinit var network2: Network
private lateinit var accountSettings: AccountSettings
private lateinit var conditions: SyncConditions
private lateinit var connectivityManager: ConnectivityManager
@Before
fun setup() {
hiltRule.inject()
// prepare accountSettings with some necessary data
accountSettings = mockk<AccountSettings> {
every { account } returns Account("test", "test")
every { getIgnoreVpns() } returns false // default value
}
conditions = factory.create(accountSettings)
connectivityManager = context.getSystemService<ConnectivityManager>()!!.also { cm ->
mockkObject(cm)
every { cm.allNetworks } returns arrayOf(network1, network2)
every { cm.getNetworkInfo(network1) } returns mockk()
every { cm.getNetworkInfo(network2) } returns mockk()
every { cm.getNetworkCapabilities(network1) } returns capabilities
every { cm.getNetworkCapabilities(network2) } returns capabilities
}
}
@Test
fun testCorrectWifiSsid_CorrectWiFiSsid() {
every { accountSettings.getSyncWifiOnlySSIDs() } returns listOf("SampleWiFi1", "ConnectedWiFi")
mockkObject(PermissionUtils)
every { PermissionUtils.canAccessWifiSsid(any()) } returns true
val wifiManager = context.getSystemService<WifiManager>()!!
mockkObject(wifiManager)
every { wifiManager.connectionInfo } returns spyk<WifiInfo>().apply {
every { ssid } returns "ConnectedWiFi"
}
assertTrue(conditions.correctWifiSsid())
}
@Test
fun testCorrectWifiSsid_WrongWiFiSsid() {
every { accountSettings.getSyncWifiOnlySSIDs() } returns listOf("SampleWiFi1", "SampleWiFi2")
mockkObject(PermissionUtils)
every { PermissionUtils.canAccessWifiSsid(any()) } returns true
val wifiManager = context.getSystemService<WifiManager>()!!
mockkObject(wifiManager)
every { wifiManager.connectionInfo } returns spyk<WifiInfo>().apply {
every { ssid } returns "ConnectedWiFi"
}
assertFalse(conditions.correctWifiSsid())
}
@Test
fun testInternetAvailable_capabilitiesNull() {
every { connectivityManager.getNetworkCapabilities(network1) } returns null
every { connectivityManager.getNetworkCapabilities(network2) } returns null
assertFalse(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_Internet() {
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns false
assertFalse(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_Validated() {
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns false
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
assertFalse(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_InternetValidated() {
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
assertTrue(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_ignoreVpns() {
every { accountSettings.getIgnoreVpns() } returns true
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_NOT_VPN) } returns false
assertFalse(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_ignoreVpns_NotVpn() {
every { accountSettings.getIgnoreVpns() } returns true
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_NOT_VPN) } returns true
assertTrue(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_twoConnectionsFirstOneWithoutInternet() {
// The real case that failed in davx5-ose#395 is that the connection list contains (in this order)
// 1. a mobile network without INTERNET, but with VALIDATED
// 2. a WiFi network with INTERNET and VALIDATED
// The "returns false" for NET_CAPABILITY_INTERNET applies to the first connection,
// the "andThen true" applies to the second one
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns false andThen true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
// There is an internet connection if any(!) connection has both INTERNET and VALIDATED.
assertTrue(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_twoConnectionsFirstOneWithoutValidated() {
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns false andThen true
assertTrue(conditions.internetAvailable())
}
@Test
fun testInternetAvailable_twoConnectionsFirstOneWithoutNotVpn() {
every { accountSettings.getIgnoreVpns() } returns true
every { capabilities.hasCapability(NET_CAPABILITY_INTERNET) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_NOT_VPN) } returns false andThen true
assertTrue(conditions.internetAvailable())
}
@Test
fun testWifiAvailable_capabilitiesNull() {
every { connectivityManager.getNetworkCapabilities(network1) } returns null
every { connectivityManager.getNetworkCapabilities(network2) } returns null
assertFalse(conditions.wifiAvailable())
}
@Test
fun testWifiAvailable() {
every { capabilities.hasTransport(TRANSPORT_WIFI) } returns false
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns false
assertFalse(conditions.wifiAvailable())
}
@Test
fun testWifiAvailable_wifi() {
every { capabilities.hasTransport(TRANSPORT_WIFI) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns false
assertFalse(conditions.wifiAvailable())
}
@Test
fun testWifiAvailable_validated() {
every { capabilities.hasTransport(TRANSPORT_WIFI) } returns false
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
assertFalse(conditions.wifiAvailable())
}
@Test
fun testWifiAvailable_wifiValidated() {
every { capabilities.hasTransport(TRANSPORT_WIFI) } returns true
every { capabilities.hasCapability(NET_CAPABILITY_VALIDATED) } returns true
assertTrue(conditions.wifiAvailable())
}
@Test
fun testWifiConditionsMet_withoutWifi() {
// "Sync only over Wi-Fi" is disabled
every { accountSettings.getSyncWifiOnly() } returns false
assertTrue(factory.create(accountSettings).wifiConditionsMet())
}
@Test
fun testWifiConditionsMet_anyWifi_wifiEnabled() {
// "Sync only over Wi-Fi" is enabled
every { accountSettings.getSyncWifiOnly() } returns true
mockkObject(conditions) {
// Wi-Fi is available
every { conditions.wifiAvailable() } returns true
// Wi-Fi SSID is correct
every { conditions.correctWifiSsid() } returns true
assertTrue(conditions.wifiConditionsMet())
}
}
@Test
fun testWifiConditionsMet_anyWifi_wifiDisabled() {
// "Sync only over Wi-Fi" is enabled
every { accountSettings.getSyncWifiOnly() } returns true
mockkObject(conditions) {
// Wi-Fi is not available
every { conditions.wifiAvailable() } returns false
// Wi-Fi SSID is correct
every { conditions.correctWifiSsid() } returns true
assertFalse(conditions.wifiConditionsMet())
}
}
}

View file

@@ -0,0 +1,90 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.sync.worker
import android.accounts.Account
import android.content.Context
import androidx.hilt.work.HiltWorkerFactory
import at.bitfire.davdroid.TestUtils
import at.bitfire.davdroid.TestUtils.workScheduledOrRunning
import at.bitfire.davdroid.sync.SyncDataType
import at.bitfire.davdroid.sync.account.TestAccount
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
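/**
 * Tests enqueueing one-time sync workers and enabling/disabling periodic sync workers via
 * [SyncWorkerManager].
 */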
@HiltAndroidTest
class SyncWorkerManagerTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@Inject
@ApplicationContext
lateinit var context: Context
@Inject
lateinit var syncWorkerManager: SyncWorkerManager
@Inject
lateinit var workerFactory: HiltWorkerFactory
lateinit var account: Account
@Before
fun setUp() {
hiltRule.inject()
TestUtils.setUpWorkManager(context, workerFactory)
account = TestAccount.create()
}
@After
fun tearDown() {
TestAccount.remove(account)
}
// one-time sync workers
@Test
fun testEnqueueOneTime() {
val workerName = OneTimeSyncWorker.workerName(account, SyncDataType.EVENTS)
assertFalse(TestUtils.workScheduledOrRunningOrSuccessful(context, workerName))
val returnedName = syncWorkerManager.enqueueOneTime(account, SyncDataType.EVENTS)
assertEquals(workerName, returnedName)
assertTrue(TestUtils.workScheduledOrRunningOrSuccessful(context, workerName))
}
// periodic sync workers
@Test
fun enablePeriodic() {
syncWorkerManager.enablePeriodic(account, SyncDataType.EVENTS, 60, false).result.get()
val workerName = PeriodicSyncWorker.workerName(account, SyncDataType.EVENTS)
assertTrue(workScheduledOrRunning(context, workerName))
}
@Test
fun disablePeriodic() {
syncWorkerManager.enablePeriodic(account, SyncDataType.EVENTS, 60, false).result.get()
syncWorkerManager.disablePeriodic(account, SyncDataType.EVENTS).result.get()
val workerName = PeriodicSyncWorker.workerName(account, SyncDataType.EVENTS)
assertFalse(workScheduledOrRunning(context, workerName))
}
}

View file

@@ -0,0 +1,94 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.ui
import at.bitfire.davdroid.db.Collection
import at.bitfire.davdroid.db.Service
import at.bitfire.davdroid.push.PushRegistrationManager
import at.bitfire.davdroid.repository.DavCollectionRepository
import at.bitfire.davdroid.repository.DavServiceRepository
import at.bitfire.davdroid.sync.worker.SyncWorkerManager
import dagger.hilt.android.testing.BindValue
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.coVerify
import io.mockk.impl.annotations.RelaxedMockK
import io.mockk.junit4.MockKRule
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.test.advanceUntilIdle
import kotlinx.coroutines.test.runTest
import okhttp3.HttpUrl.Companion.toHttpUrl
import org.junit.After
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject
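/**
 * Verifies that [CollectionSelectedUseCase.handleWithDelay] triggers a sync for all authorities
 * and a push registration update for the collection's service.
 */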
@OptIn(ExperimentalCoroutinesApi::class)
@HiltAndroidTest
class CollectionSelectedUseCaseTest {
@get:Rule
val hiltRule = HiltAndroidRule(this)
@get:Rule
val mockkRule = MockKRule(this)
val collection = Collection(
id = 2,
serviceId = 1,
type = Collection.Companion.TYPE_CALENDAR,
url = "https://example.com".toHttpUrl()
)
@Inject
lateinit var collectionRepository: DavCollectionRepository
val service = Service(
id = 1,
type = Service.Companion.TYPE_CALDAV,
accountName = "test@example.com"
)
@BindValue
@RelaxedMockK
lateinit var pushRegistrationManager: PushRegistrationManager
@Inject
lateinit var serviceRepository: DavServiceRepository
@BindValue
@RelaxedMockK
lateinit var syncWorkerManager: SyncWorkerManager
@Inject
lateinit var useCase: CollectionSelectedUseCase
@Before
fun setUp() {
hiltRule.inject()
serviceRepository.insertOrReplaceBlocking(service)
collectionRepository.insertOrUpdateByUrl(collection)
}
@After
fun tearDown() {
serviceRepository.deleteAllBlocking()
}
@Test
fun testHandleWithDelay() = runTest {
useCase.handleWithDelay(collectionId = collection.id)
advanceUntilIdle()
coVerify {
syncWorkerManager.enqueueOneTimeAllAuthorities(any())
pushRegistrationManager.update(service.id)
}
}
}

View file

@@ -0,0 +1,37 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.ui
import androidx.test.platform.app.InstrumentationRegistry
import org.junit.Assert.assertEquals
import org.junit.Test
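/**
 * Verifies that [DebugInfoActivity.IntentBuilder] truncates large extras (local resource dumps
 * and logs) to [DebugInfoActivity.IntentBuilder.MAX_ELEMENT_SIZE] characters, ending with "...".
 */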
class DebugInfoActivityTest {
@Test
fun testIntentBuilder_LargeLocalResource() {
val a = 'A'.code.toByte()
val intent = DebugInfoActivity.IntentBuilder(InstrumentationRegistry.getInstrumentation().context)
.withLocalResource(String(ByteArray(1024*1024) { a }))
.build()
val expected = StringBuilder(DebugInfoActivity.IntentBuilder.MAX_ELEMENT_SIZE)
expected.append(String(ByteArray(DebugInfoActivity.IntentBuilder.MAX_ELEMENT_SIZE - 3) { a }))
expected.append("...")
assertEquals(expected.toString(), intent.getStringExtra(DebugInfoActivity.EXTRA_LOCAL_RESOURCE_SUMMARY))
}
@Test
fun testIntentBuilder_LargeLogs() {
val a = 'A'.code.toByte()
val intent = DebugInfoActivity.IntentBuilder(InstrumentationRegistry.getInstrumentation().context)
.withLogs(String(ByteArray(1024*1024) { a }))
.build()
val expected = StringBuilder(DebugInfoActivity.IntentBuilder.MAX_ELEMENT_SIZE)
expected.append(String(ByteArray(DebugInfoActivity.IntentBuilder.MAX_ELEMENT_SIZE - 3) { a }))
expected.append("...")
assertEquals(expected.toString(), intent.getStringExtra("logs"))
}
}

View file

@@ -0,0 +1,67 @@
/*
* Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
*/
package at.bitfire.davdroid.ui.setup
import android.content.Intent
import android.net.Uri
import org.junit.Assert.assertEquals
import org.junit.Test
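/**
 * Verifies that [LoginActivity.loginInfoFromIntent] extracts the base URI and credentials from
 * explicit intent extras as well as from davx5:// and mailto: URIs.
 */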
class LoginActivityTest {
@Test
fun loginInfoFromIntent() {
val intent = Intent().apply {
data = Uri.parse("https://example.com/nextcloud")
putExtra(LoginActivity.EXTRA_USERNAME, "user")
putExtra(LoginActivity.EXTRA_PASSWORD, "password")
}
val loginInfo = LoginActivity.loginInfoFromIntent(intent)
assertEquals("https://example.com/nextcloud", loginInfo.baseUri.toString())
assertEquals("user", loginInfo.credentials!!.username)
assertEquals("password", loginInfo.credentials.password?.asString())
}
@Test
fun loginInfoFromIntent_withPort() {
val intent = Intent().apply {
data = Uri.parse("https://example.com:444/nextcloud")
putExtra(LoginActivity.EXTRA_USERNAME, "user")
putExtra(LoginActivity.EXTRA_PASSWORD, "password")
}
val loginInfo = LoginActivity.loginInfoFromIntent(intent)
assertEquals("https://example.com:444/nextcloud", loginInfo.baseUri.toString())
assertEquals("user", loginInfo.credentials!!.username)
assertEquals("password", loginInfo.credentials.password?.asString())
}
@Test
fun loginInfoFromIntent_implicit() {
val intent = Intent(Intent.ACTION_VIEW, Uri.parse("davx5://user:password@example.com/path"))
val loginInfo = LoginActivity.loginInfoFromIntent(intent)
assertEquals("https://example.com/path", loginInfo.baseUri.toString())
assertEquals("user", loginInfo.credentials!!.username)
assertEquals("password", loginInfo.credentials.password?.asString())
}
@Test
fun loginInfoFromIntent_implicit_withPort() {
val intent = Intent(Intent.ACTION_VIEW, Uri.parse("davx5://user:password@example.com:0/path"))
val loginInfo = LoginActivity.loginInfoFromIntent(intent)
assertEquals("https://example.com:0/path", loginInfo.baseUri.toString())
assertEquals("user", loginInfo.credentials!!.username)
assertEquals("password", loginInfo.credentials.password?.asString())
}
@Test
fun loginInfoFromIntent_implicit_email() {
val intent = Intent(Intent.ACTION_VIEW, Uri.parse("mailto:user@example.com"))
val loginInfo = LoginActivity.loginInfoFromIntent(intent)
assertEquals(null, loginInfo.baseUri)
assertEquals("user@example.com", loginInfo.credentials!!.username)
assertEquals(null, loginInfo.credentials.password?.asString())
}
}

View file

@@ -0,0 +1,41 @@
/*
 * Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
 */

package at.bitfire.davdroid.webdav

import at.bitfire.davdroid.settings.Credentials
import at.bitfire.davdroid.util.SensitiveString.Companion.toSensitiveString
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject

@HiltAndroidTest
class CredentialsStoreTest {

    @get:Rule
    val hiltRule = HiltAndroidRule(this)

    @Inject
    lateinit var store: CredentialsStore

    @Before
    fun setUp() {
        hiltRule.inject()
    }

    @Test
    fun testSetGetDelete() {
        store.setCredentials(0, Credentials(username = "myname", password = "12345".toSensitiveString()))
        assertEquals(Credentials(username = "myname", password = "12345".toSensitiveString()), store.getCredentials(0))

        store.setCredentials(0, null)
        assertNull(store.getCredentials(0))
    }

}

View file

@ -0,0 +1,66 @@
/*
 * Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
 */

package at.bitfire.davdroid.webdav

import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import junit.framework.TestCase.assertEquals
import kotlinx.coroutines.test.runTest
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import javax.inject.Inject

@HiltAndroidTest
class WebDavMountRepositoryTest {

    @get:Rule
    val hiltRule = HiltAndroidRule(this)

    @Inject
    lateinit var repository: WebDavMountRepository

    @Before
    fun setUp() {
        hiltRule.inject()
    }

    val web = MockWebServer()
    val url = web.url("/")

    @Test
    fun testHasWebDav_NoDavHeader() = runTest {
        web.enqueue(MockResponse().setResponseCode(200))
        assertNull(repository.hasWebDav(url, null))
    }

    @Test
    fun testHasWebDav_DavClass1() = runTest {
        web.enqueue(MockResponse()
            .setResponseCode(200)
            .addHeader("DAV: 1"))
        assertEquals(url, repository.hasWebDav(url, null))
    }

    @Test
    fun testHasWebDav_DavClass2() = runTest {
        web.enqueue(MockResponse()
            .setResponseCode(200)
            .addHeader("DAV: 1, 2"))
        assertEquals(url, repository.hasWebDav(url, null))
    }

    @Test
    fun testHasWebDav_DavClass3() = runTest {
        web.enqueue(MockResponse()
            .setResponseCode(200)
            .addHeader("DAV: 1, 3"))
        assertEquals(url, repository.hasWebDav(url, null))
    }

}

View file

@ -0,0 +1,247 @@
/*
 * Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
 */

package at.bitfire.davdroid.webdav.operation

import android.content.Context
import android.security.NetworkSecurityPolicy
import at.bitfire.davdroid.db.AppDatabase
import at.bitfire.davdroid.db.WebDavDocument
import at.bitfire.davdroid.db.WebDavMount
import at.bitfire.davdroid.network.HttpClient
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.android.testing.HiltAndroidRule
import dagger.hilt.android.testing.HiltAndroidTest
import io.mockk.junit4.MockKRule
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.test.runTest
import okhttp3.mockwebserver.Dispatcher
import okhttp3.mockwebserver.MockResponse
import okhttp3.mockwebserver.MockWebServer
import okhttp3.mockwebserver.RecordedRequest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Rule
import org.junit.Test
import java.util.logging.Logger
import javax.inject.Inject

@HiltAndroidTest
class QueryChildDocumentsOperationTest {

    @get:Rule
    val hiltRule = HiltAndroidRule(this)

    @get:Rule
    val mockkRule = MockKRule(this)

    @Inject @ApplicationContext
    lateinit var context: Context

    @Inject
    lateinit var db: AppDatabase

    @Inject
    lateinit var operation: QueryChildDocumentsOperation

    @Inject
    lateinit var httpClientBuilder: HttpClient.Builder

    @Inject
    lateinit var testDispatcher: TestDispatcher

    private lateinit var server: MockWebServer
    private lateinit var client: HttpClient
    private lateinit var mount: WebDavMount
    private lateinit var rootDocument: WebDavDocument

    @Before
    fun setUp() {
        hiltRule.inject()

        // create server and client
        server = MockWebServer().apply {
            dispatcher = testDispatcher
            start()
        }
        client = httpClientBuilder.build()

        // mock server delivers HTTP without encryption
        assertTrue(NetworkSecurityPolicy.getInstance().isCleartextTrafficPermitted)

        // create WebDAV mount and root document in DB
        runBlocking {
            val mountId = db.webDavMountDao().insert(WebDavMount(0, "Cat food storage", server.url(PATH_WEBDAV_ROOT)))
            mount = db.webDavMountDao().getById(mountId)
            rootDocument = db.webDavDocumentDao().getOrCreateRoot(mount)
        }
    }

    @After
    fun tearDown() {
        client.close()
        server.shutdown()

        runBlocking {
            db.webDavMountDao().deleteAsync(mount)
        }
    }

    @Test
    fun testDoQueryChildren_insert() = runTest {
        // Query
        operation.queryChildren(rootDocument)

        // Assert new children were inserted into db
        assertEquals(3, db.webDavDocumentDao().getChildren(rootDocument.id).size)
        assertEquals("Library", db.webDavDocumentDao().getChildren(rootDocument.id)[0].displayName)
        assertEquals("MeowMeow_Cats.docx", db.webDavDocumentDao().getChildren(rootDocument.id)[1].displayName)
        assertEquals("Secret_Document.pages", db.webDavDocumentDao().getChildren(rootDocument.id)[2].displayName)
    }

    @Test
    fun testDoQueryChildren_update() = runTest {
        // Create parent and root in database
        assertEquals("Cat food storage", db.webDavDocumentDao().get(rootDocument.id)!!.displayName)

        // Create a folder
        val folderId = db.webDavDocumentDao().insert(
            WebDavDocument(
                0,
                mount.id,
                rootDocument.id,
                "My_Books",
                true,
                "My Books",
            )
        )
        assertEquals("My_Books", db.webDavDocumentDao().get(folderId)!!.name)
        assertEquals("My Books", db.webDavDocumentDao().get(folderId)!!.displayName)

        // Query - should update the parent displayname and folder name
        operation.queryChildren(rootDocument)

        // Assert parent and children were updated in database
        assertEquals("Cats WebDAV", db.webDavDocumentDao().get(rootDocument.id)!!.displayName)
        assertEquals("Library", db.webDavDocumentDao().getChildren(rootDocument.id)[0].name)
        assertEquals("Library", db.webDavDocumentDao().getChildren(rootDocument.id)[0].displayName)
    }

    @Test
    fun testDoQueryChildren_delete() = runTest {
        // Create a folder
        val folderId = db.webDavDocumentDao().insert(
            WebDavDocument(0, mount.id, rootDocument.id, "deleteme", true, "Should be deleted")
        )
        assertEquals("deleteme", db.webDavDocumentDao().get(folderId)!!.name)

        // Query - discovers server-side deletion
        operation.queryChildren(rootDocument)

        // Assert folder got deleted
        assertEquals(null, db.webDavDocumentDao().get(folderId))
    }

    @Test
    fun testDoQueryChildren_updateTwoDirectoriesSimultaneously() = runTest {
        // Create two directories
        val parent1Id = db.webDavDocumentDao().insert(WebDavDocument(0, mount.id, rootDocument.id, "parent1", true))
        val parent2Id = db.webDavDocumentDao().insert(WebDavDocument(0, mount.id, rootDocument.id, "parent2", true))
        val parent1 = db.webDavDocumentDao().get(parent1Id)!!
        val parent2 = db.webDavDocumentDao().get(parent2Id)!!
        assertEquals("parent1", parent1.name)
        assertEquals("parent2", parent2.name)

        // Query - find children of two nodes simultaneously
        operation.queryChildren(parent1)
        operation.queryChildren(parent2)

        // Assert the two folders' names have changed
        assertEquals("childOne.txt", db.webDavDocumentDao().getChildren(parent1Id)[0].name)
        assertEquals("childTwo.txt", db.webDavDocumentDao().getChildren(parent2Id)[0].name)
    }


    // mock server

    class TestDispatcher @Inject constructor(
        private val logger: Logger
    ) : Dispatcher() {

        data class Resource(
            val name: String,
            val props: String
        )

        override fun dispatch(request: RecordedRequest): MockResponse {
            logger.info("Request: $request")
            val requestPath = request.path!!.trimEnd('/')

            if (request.method.equals("PROPFIND", true)) {
                val propsMap = mutableMapOf(
                    PATH_WEBDAV_ROOT to arrayOf(
                        Resource("",
                            "<resourcetype><collection/></resourcetype>" +
                            "<displayname>Cats WebDAV</displayname>"
                        ),
                        Resource("Secret_Document.pages",
                            "<displayname>Secret_Document.pages</displayname>",
                        ),
                        Resource("MeowMeow_Cats.docx",
                            "<displayname>MeowMeow_Cats.docx</displayname>"
                        ),
                        Resource("Library",
                            "<resourcetype><collection/></resourcetype>" +
                            "<displayname>Library</displayname>"
                        )
                    ),
                    "$PATH_WEBDAV_ROOT/parent1" to arrayOf(
                        Resource("childOne.txt",
                            "<displayname>childOne.txt</displayname>"
                        ),
                    ),
                    "$PATH_WEBDAV_ROOT/parent2" to arrayOf(
                        Resource("childTwo.txt",
                            "<displayname>childTwo.txt</displayname>"
                        )
                    )
                )

                val responses = propsMap[requestPath]?.joinToString { resource ->
                    "<response><href>$requestPath/${resource.name}</href><propstat><prop>" +
                        resource.props +
                    "</prop></propstat></response>"
                }
                val multistatus =
                    "<multistatus xmlns='DAV:' " +
                        "xmlns:CARD='urn:ietf:params:xml:ns:carddav' " +
                        "xmlns:CAL='urn:ietf:params:xml:ns:caldav'>" +
                        responses +
                    "</multistatus>"

                logger.info("Response: $multistatus")
                return MockResponse()
                    .setResponseCode(207)
                    .setBody(multistatus)
            }

            return MockResponse().setResponseCode(404)
        }

    }


    companion object {
        private const val PATH_WEBDAV_ROOT = "/webdav"
    }

}

View file

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <string name="app_name">Davx5Test</string>
</resources>

View file

@ -0,0 +1,353 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
~ Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
-->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:installLocation="internalOnly">
<!-- normal permissions -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.POST_NOTIFICATIONS"/>
<uses-permission android:name="android.permission.READ_SYNC_SETTINGS"/>
<uses-permission android:name="android.permission.READ_SYNC_STATS"/>
<uses-permission android:name="android.permission.WRITE_SYNC_SETTINGS"/>
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.REQUEST_IGNORE_BATTERY_OPTIMIZATIONS"/>
<!-- other permissions -->
<!-- android.permission-group.CONTACTS -->
<uses-permission android:name="android.permission.READ_CONTACTS"/>
<uses-permission android:name="android.permission.WRITE_CONTACTS"/>
<!-- android.permission-group.CALENDAR -->
<uses-permission android:name="android.permission.READ_CALENDAR"/>
<uses-permission android:name="android.permission.WRITE_CALENDAR"/>
<!-- android.permission-group.LOCATION -->
<!-- getting the WiFi name (for "sync in Wifi only") requires
- coarse location (Android 8.1)
- fine location (Android 10) -->
<uses-permission-sdk-23 android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission-sdk-23 android:name="android.permission.ACCESS_FINE_LOCATION"/>
<!-- required since Android 10 to get the WiFi name while in background (= while syncing) -->
<uses-permission-sdk-23 android:name="android.permission.ACCESS_BACKGROUND_LOCATION"/>
<!-- ical4android declares task access permissions -->
<!-- Disable GPS capability requirement, which is implicitly derived from ACCESS_FINE_LOCATION
permission and makes app unusable on some devices without GPS. We need location permissions only
to get the current WiFi SSID, and we don't need GPS for that. -->
<uses-feature android:name="android.hardware.location.gps" android:required="false" />
<application
android:name=".App"
android:allowBackup="false"
android:networkSecurityConfig="@xml/network_security_config"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:theme="@style/Theme.AppCompat.DayNight.NoActionBar"
android:resizeableActivity="true"
tools:ignore="UnusedAttribute"
android:supportsRtl="true">
<!-- Required for Hilt/WorkManager integration. See
- https://developer.android.com/develop/background-work/background-tasks/persistent/configuration/custom-configuration#remove-default
- https://developer.android.com/training/dependency-injection/hilt-jetpack#workmanager
However, we must not disable AndroidX startup completely, as it's needed by other libraries like okhttp. -->
<provider
android:name="androidx.startup.InitializationProvider"
android:authorities="${applicationId}.androidx-startup"
android:exported="false"
tools:node="merge">
<meta-data
android:name="androidx.work.WorkManagerInitializer"
android:value="androidx.startup"
tools:node="remove" />
</provider>
<!-- Remove the node added by AppAuth (remove only from net.openid.appauth library, not from our flavor manifest files) -->
<activity android:name="net.openid.appauth.RedirectUriReceiverActivity"
tools:node="remove" tools:selector="net.openid.appauth"/>
<activity android:name=".ui.intro.IntroActivity" />
<activity
android:name=".ui.AccountsActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
</activity>
<activity
android:name=".ui.AboutActivity"
android:label="@string/navigation_drawer_about"
android:parentActivityName=".ui.AccountsActivity"/>
<activity
android:name=".ui.AppSettingsActivity"
android:label="@string/app_settings"
android:parentActivityName=".ui.AccountsActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.APPLICATION_PREFERENCES"/>
<category android:name="android.intent.category.DEFAULT" />
</intent-filter>
</activity>
<activity
android:name=".ui.DebugInfoActivity"
android:parentActivityName=".ui.AppSettingsActivity"
android:exported="false"
android:label="@string/debug_info_title">
<intent-filter>
<action android:name="android.intent.action.BUG_REPORT"/>
<category android:name="android.intent.category.DEFAULT" />
</intent-filter>
</activity>
<activity
android:name=".ui.PermissionsActivity"
android:label="@string/app_settings_security_app_permissions"
android:parentActivityName=".ui.AppSettingsActivity" />
<activity
android:name=".ui.TasksActivity"
android:label="@string/intro_tasks_title"
android:parentActivityName=".ui.AppSettingsActivity" />
<activity
android:name=".ui.setup.LoginActivity"
android:parentActivityName=".ui.AccountsActivity"
android:windowSoftInputMode="adjustResize"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
</intent-filter>
<intent-filter>
<action android:name="android.intent.action.VIEW"/>
<category android:name="android.intent.category.BROWSABLE"/>
<category android:name="android.intent.category.DEFAULT"/>
<data android:scheme="caldav"/>
<data android:scheme="caldavs"/>
<data android:scheme="carddav"/>
<data android:scheme="carddavs"/>
<data android:scheme="davx5"/>
</intent-filter>
<intent-filter>
<action android:name="loginFlow" /> <!-- Ensures this filter matches, even if the sending app is not defining an action -->
<category android:name="android.intent.category.DEFAULT" />
<data
tools:ignore="AppLinkUrlError"
android:scheme="http" />
<data android:scheme="https" />
</intent-filter>
</activity>
<activity
android:name=".ui.account.AccountActivity"
android:parentActivityName=".ui.AccountsActivity"
android:exported="true">
</activity>
<activity
android:name=".ui.account.CollectionActivity"
android:parentActivityName=".ui.account.AccountActivity" />
<activity
android:name=".ui.account.CreateAddressBookActivity"
android:parentActivityName=".ui.account.AccountActivity" />
<activity
android:name=".ui.account.CreateCalendarActivity"
android:parentActivityName=".ui.account.AccountActivity" />
<activity
android:name=".ui.account.AccountSettingsActivity"
android:parentActivityName=".ui.account.AccountActivity" />
<activity
android:name=".ui.account.WifiPermissionsActivity"
android:parentActivityName=".ui.account.AccountSettingsActivity" />
<activity
android:name=".ui.webdav.WebdavMountsActivity"
android:exported="true"
android:parentActivityName=".ui.AccountsActivity" />
<activity
android:name=".ui.webdav.AddWebdavMountActivity"
android:parentActivityName=".ui.webdav.WebdavMountsActivity"
android:windowSoftInputMode="adjustResize" />
<!-- account type "DAVx⁵" -->
<service
android:name=".sync.account.AccountAuthenticatorService"
android:exported="false">
<intent-filter>
<action android:name="android.accounts.AccountAuthenticator"/>
</intent-filter>
<meta-data
android:name="android.accounts.AccountAuthenticator"
android:resource="@xml/account_authenticator"/>
</service>
<service
android:name=".sync.adapter.CalendarsSyncAdapterService"
android:exported="true"
tools:ignore="ExportedService">
<intent-filter>
<action android:name="android.content.SyncAdapter"/>
</intent-filter>
<meta-data
android:name="android.content.SyncAdapter"
android:resource="@xml/sync_calendars"/>
</service>
<service
android:name=".sync.adapter.JtxSyncAdapterService"
android:exported="true"
tools:ignore="ExportedService">
<intent-filter>
<action android:name="android.content.SyncAdapter"/>
</intent-filter>
<meta-data
android:name="android.content.SyncAdapter"
android:resource="@xml/sync_notes"/>
</service>
<service
android:name=".sync.adapter.OpenTasksSyncAdapterService"
android:exported="true"
tools:ignore="ExportedService">
<intent-filter>
<action android:name="android.content.SyncAdapter"/>
</intent-filter>
<meta-data
android:name="android.content.SyncAdapter"
android:resource="@xml/sync_opentasks"/>
</service>
<service
android:name=".sync.adapter.TasksOrgSyncAdapterService"
android:exported="true"
tools:ignore="ExportedService">
<intent-filter>
<action android:name="android.content.SyncAdapter"/>
</intent-filter>
<meta-data
android:name="android.content.SyncAdapter"
android:resource="@xml/sync_tasks_org"/>
</service>
<provider
android:authorities="@string/webdav_authority"
android:name=".webdav.DavDocumentsProvider"
android:exported="true"
android:grantUriPermissions="true"
android:permission="android.permission.MANAGE_DOCUMENTS">
<intent-filter>
<action android:name="android.content.action.DOCUMENTS_PROVIDER" />
</intent-filter>
</provider>
<!-- account type "DAVx⁵ Address book" -->
<service
android:name=".sync.account.AddressBookAuthenticatorService"
android:exported="true"
tools:ignore="ExportedService"> <!-- Since Android 11, this must be true so that Google Contacts shows the address book accounts -->
<intent-filter>
<action android:name="android.accounts.AccountAuthenticator"/>
</intent-filter>
<meta-data
android:name="android.accounts.AccountAuthenticator"
android:resource="@xml/account_authenticator_address_book"/>
</service>
<service
android:name=".sync.adapter.ContactsSyncAdapterService"
android:exported="true"
tools:ignore="ExportedService">
<intent-filter>
<action android:name="android.content.SyncAdapter"/>
</intent-filter>
<meta-data
android:name="android.content.SyncAdapter"
android:resource="@xml/sync_contacts"/>
<meta-data
android:name="android.provider.CONTACTS_STRUCTURE"
android:resource="@xml/contacts"/>
</service>
<!-- provider to share debug info/logs -->
<provider
android:name="androidx.core.content.FileProvider"
android:authorities="@string/authority_debug_provider"
android:grantUriPermissions="true"
android:exported="false">
<meta-data
android:name="android.support.FILE_PROVIDER_PATHS"
android:resource="@xml/debug_paths" />
</provider>
<!-- UnifiedPush -->
<service android:exported="false" android:name=".push.UnifiedPushService">
<intent-filter>
<action android:name="org.unifiedpush.android.connector.PUSH_EVENT"/>
</intent-filter>
</service>
<!-- Widgets -->
<receiver android:name=".ui.widget.LabeledSyncButtonWidgetReceiver"
android:label="@string/widget_labeled_sync_label"
android:exported="true">
<intent-filter>
<action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
</intent-filter>
<meta-data
android:name="android.appwidget.provider"
android:resource="@xml/widget_info_labeled_sync_button" />
</receiver>
<receiver android:name=".ui.widget.IconSyncButtonWidgetReceiver"
android:label="@string/widget_icon_sync_label"
android:exported="true">
<intent-filter>
<action android:name="android.appwidget.action.APPWIDGET_UPDATE" />
</intent-filter>
<meta-data
android:name="android.appwidget.provider"
android:resource="@xml/widget_info_icon_sync_button" />
</receiver>
</application>
<!-- package visibility: which apps do we need to see? -->
<queries>
<!-- system providers (listing them is technically not required, but some apps like the
Huawei calendar take this as an indication of whether these providers are accessed) -->
<provider android:authorities="com.android.calendar"/>
<provider android:authorities="com.android.contacts"/>
<!-- task providers -->
<package android:name="at.techbee.jtx" />
<package android:name="org.dmfs.tasks" />
<package android:name="org.tasks" />
<!-- ICSx5 for Webcal feeds -->
<package android:name="at.bitfire.icsdroid"/>
<!-- apps that interact with contact, calendar, task data (for debug info) -->
<intent>
<action android:name="*" />
<data android:scheme="content" android:host="com.android.contacts" />
</intent>
<intent>
<action android:name="*" />
<data android:scheme="content" android:host="com.android.calendar" />
</intent>
<!-- Open URLs in a browser or other app [https://developer.android.com/training/package-visibility/use-cases#open-urls-browser-or-other-app] -->
<intent>
<action android:name="android.intent.action.VIEW" />
<data android:scheme="https" />
</intent>
<!-- Custom Tabs support (e.g. Nextcloud Login Flow) -->
<intent>
<action android:name="android.support.customtabs.action.CustomTabsService" />
</intent>
</queries>
</manifest>
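
The location permissions requested in the manifest above (coarse, fine and background location) exist only because Android ties the WiFi SSID to location data, as the manifest comments explain. A minimal sketch of what reading the SSID can look like; the helper name and the exact API used here are illustrative choices for this sketch, not taken from this commit:

```
import android.content.Context
import android.net.wifi.WifiManager

// Illustrative helper (not part of DAVx⁵): returns the current WiFi SSID, or null if it is
// not available. Without the location permissions requested in the manifest above, Android
// only reports a placeholder instead of the real network name.
fun currentSsidOrNull(context: Context): String? {
    val wifiManager = context.applicationContext.getSystemService(Context.WIFI_SERVICE) as WifiManager
    @Suppress("DEPRECATION")  // connectionInfo is deprecated, but sufficient for a sketch
    val ssid = wifiManager.connectionInfo?.ssid ?: return null
    return ssid.takeUnless { it == "<unknown ssid>" }  // placeholder returned without location access
        ?.removeSurrounding("\"")
}
```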

View file

@ -0,0 +1,628 @@
<h3 style="text-align: center;">GNU GENERAL PUBLIC LICENSE</h3>
<p style="text-align: center;">Version 3, 29 June 2007</p>
<p>Copyright &copy; 2007 Free Software Foundation, Inc.
&lt;<a href="http://fsf.org/">http://fsf.org/</a>&gt;</p><p>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.</p>
<h3><a name="preamble"></a>Preamble</h3>
<p>The GNU General Public License is a free, copyleft license for
software and other kinds of works.</p>
<p>The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.</p>
<p>When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.</p>
<p>To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.</p>
<p>For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.</p>
<p>Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.</p>
<p>For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.</p>
<p>Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.</p>
<p>Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.</p>
<p>The precise terms and conditions for copying, distribution and
modification follow.</p>
<h3><a name="terms"></a>TERMS AND CONDITIONS</h3>
<h4><a name="section0"></a>0. Definitions.</h4>
<p>&ldquo;This License&rdquo; refers to version 3 of the GNU General Public License.</p>
<p>&ldquo;Copyright&rdquo; also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.</p>
<p>&ldquo;The Program&rdquo; refers to any copyrightable work licensed under this
License. Each licensee is addressed as &ldquo;you&rdquo;. &ldquo;Licensees&rdquo; and
&ldquo;recipients&rdquo; may be individuals or organizations.</p>
<p>To &ldquo;modify&rdquo; a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a &ldquo;modified version&rdquo; of the
earlier work or a work &ldquo;based on&rdquo; the earlier work.</p>
<p>A &ldquo;covered work&rdquo; means either the unmodified Program or a work based
on the Program.</p>
<p>To &ldquo;propagate&rdquo; a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.</p>
<p>To &ldquo;convey&rdquo; a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.</p>
<p>An interactive user interface displays &ldquo;Appropriate Legal Notices&rdquo;
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.</p>
<h4><a name="section1"></a>1. Source Code.</h4>
<p>The &ldquo;source code&rdquo; for a work means the preferred form of the work
for making modifications to it. &ldquo;Object code&rdquo; means any non-source
form of a work.</p>
<p>A &ldquo;Standard Interface&rdquo; means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.</p>
<p>The &ldquo;System Libraries&rdquo; of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
&ldquo;Major Component&rdquo;, in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.</p>
<p>The &ldquo;Corresponding Source&rdquo; for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.</p>
<p>The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.</p>
<p>The Corresponding Source for a work in source code form is that
same work.</p>
<h4><a name="section2"></a>2. Basic Permissions.</h4>
<p>All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.</p>
<p>You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.</p>
<p>Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.</p>
<h4><a name="section3"></a>3. Protecting Users\' Legal Rights From Anti-Circumvention Law.</h4>
<p>No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.</p>
<p>When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.</p>
<h4><a name="section4"></a>4. Conveying Verbatim Copies.</h4>
<p>You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.</p>
<p>You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.</p>
<h4><a name="section5"></a>5. Conveying Modified Source Versions.</h4>
<p>You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:</p>
<ul>
<li>a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.</li>
<li>b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
&ldquo;keep intact all notices&rdquo;.</li>
<li>c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.</li>
<li>d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.</li>
</ul>
<p>A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
&ldquo;aggregate&rdquo; if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.</p>
<h4><a name="section6"></a>6. Conveying Non-Source Forms.</h4>
<p>You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:</p>
<ul>
<li>a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.</li>
<li>b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.</li>
<li>c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.</li>
<li>d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.</li>
<li>e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.</li>
</ul>
<p>A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.</p>
<p>A &ldquo;User Product&rdquo; is either (1) a &ldquo;consumer product&rdquo;, which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, &ldquo;normally used&rdquo; refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.</p>
<p>&ldquo;Installation Information&rdquo; for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.</p>
<p>If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).</p>
<p>The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.</p>
<p>Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.</p>
<h4><a name="section7"></a>7. Additional Terms.</h4>
<p>&ldquo;Additional permissions&rdquo; are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.</p>
<p>When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.</p>
<p>Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:</p>
<ul>
<li>a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or</li>
<li>b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or</li>
<li>c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or</li>
<li>d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or</li>
<li>e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or</li>
<li>f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.</li>
</ul>
<p>All other non-permissive additional terms are considered &ldquo;further
restrictions&rdquo; within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.</p>
<p>If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.</p>
<p>Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.</p>
<h4><a name="section8"></a>8. Termination.</h4>
<p>You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).</p>
<p>However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.</p>
<p>Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.</p>
<p>Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.</p>
<h4><a name="section9"></a>9. Acceptance Not Required for Having Copies.</h4>
<p>You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.</p>
<h4><a name="section10"></a>10. Automatic Licensing of Downstream Recipients.</h4>
<p>Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.</p>
<p>An &ldquo;entity transaction&rdquo; is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.</p>
<p>You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.</p>
<h4><a name="section11"></a>11. Patents.</h4>
<p>A &ldquo;contributor&rdquo; is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's &ldquo;contributor version&rdquo;.</p>
<p>A contributor's &ldquo;essential patent claims&rdquo; are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, &ldquo;control&rdquo; includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.</p>
<p>Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.</p>
<p>In the following three paragraphs, a &ldquo;patent license&rdquo; is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To &ldquo;grant&rdquo; such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.</p>
<p>If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. &ldquo;Knowingly relying&rdquo; means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.</p>
<p>If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.</p>
<p>A patent license is &ldquo;discriminatory&rdquo; if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.</p>
<p>Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.</p>
<h4><a name="section12"></a>12. No Surrender of Others\' Freedom.</h4>
<p>If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.</p>
<h4><a name="section13"></a>13. Use with the GNU Affero General Public License.</h4>
<p>Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.</p>
<h4><a name="section14"></a>14. Revised Versions of this License.</h4>
<p>The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.</p>
<p>Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License &ldquo;or any later version&rdquo; applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.</p>
<p>If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.</p>
<p>Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.</p>
<h4><a name="section15"></a>15. Disclaimer of Warranty.</h4>
<p>THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM &ldquo;AS IS&rdquo; WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.</p>
<h4><a name="section16"></a>16. Limitation of Liability.</h4>
<p>IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.</p>
<h4><a name="section17"></a>17. Interpretation of Sections 15 and 16.</h4>
<p>If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.</p>
<p>END OF TERMS AND CONDITIONS</p>

View file

@ -0,0 +1,4 @@
# reduce verbosity of some otherwise annoying ical4j messages
net.fortuna.ical4j.data.level = INFO
net.fortuna.ical4j.model.Recur.level = INFO
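
A java.util.logging configuration like the one above only takes effect once it is read into the JVM's LogManager. A minimal sketch, assuming the file is available on the classpath as /logging.properties (the resource name and the call site are assumptions, not taken from this commit):

```
import java.util.logging.LogManager

// Illustrative only: load a java.util.logging configuration (such as the properties above)
// from the classpath so that the ical4j logger levels actually apply.
fun applyLoggingConfiguration() {
    val stream = object {}.javaClass.getResourceAsStream("/logging.properties") ?: return
    stream.use { LogManager.getLogManager().readConfiguration(it) }
}
```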

View file

@ -0,0 +1 @@
{"ar_SA":["abdunnasir"],"bg":["dpa_transifex"],"ca":["Kintu","jordibrus","zagur"],"cs":["pavelb","svetlemodry","tomas.odehnal"],"da":["Tntdruid_","knutztar","mjjzf","twikedk"],"de":["Atalanttore","TheName","Waldmeisda","Wyrrrd","YvanM","amandablue","anestiskaci","corppneq","crit12","hammaschlach","maxkl","nicolas_git","owncube"],"el":["KristinaQejvanaj","anestiskaci","diamond_gr"],"es":["Ark74","Elhea","GranPC","aluaces","jcvielma","plaguna","polkhas","xphnx"],"eu":["Osoitz","Thadah","cockeredradiation"],"fa":["Numb","ahangarha","amiraliakbari","joojoojoo","maryambehzi","mtashackori","taranehsaei"],"fr":["AlainR","Amadeen","Floflr","JorisBodin","Llorc","LoiX07","Novick","Poussinou","Thecross","YvanM","alkino2","boutil","callmemagnus","chrcha","grenatrad","jokx","mathieugfortin","paullbn","vincen","ÉricB."],"fr_FR":["Llorc","Poussinou","chrcha"],"gl":["aluaces"],"hu":["Roshek","infeeeee","jtg"],"it":["Damtux","FranzMari","ed0","malaerba","noccio","nwandy","rickyroo","technezio"],"it_IT":["malaerba"],"ja":["Naofumi","yanorei32"],"nb_NO":["elonus"],"nl":["XtremeNova","davtemp","dehart","erikhubers","frankyboy1963"],"pl":["TORminator","TheName","Valdnet","gsz","mg6","oskarjakiela"],"pt":["amalvarenga","wanderlei.huttel"],"pt_BR":["wanderlei.huttel"],"ru":["aigoshin","anm","ashed","astalavister","nick.savin","vaddd"],"sk_SK":["brango67","tiborepcek"],"sl_SI":["MrLaaky","uroszor"],"sr":["daimonion"],"sv":["Mikaelb","campbelldavid"],"szl":["chlodny"],"tr_TR":["ooguz","pultars"],"uk":["androsua","olexn","twixi007"],"uk_UA":["astalavister"],"zh_CN":["anolir","jxj2zzz79pfp9bpo","linuxbckp","mofitt2016","oksjd","phy","spice2wolf"],"zh_TW":["linuxbckp","mofitt2016","phy","waiabsfabuloushk"]}

View file

@ -0,0 +1,83 @@
/*
 * Copyright © All Contributors. See LICENSE and AUTHORS in the root directory for details.
 */

package at.bitfire.davdroid

import android.app.Application
import androidx.hilt.work.HiltWorkerFactory
import androidx.work.Configuration
import at.bitfire.davdroid.di.DefaultDispatcher
import at.bitfire.davdroid.log.LogManager
import at.bitfire.davdroid.startup.StartupPlugin
import at.bitfire.davdroid.sync.account.AccountsCleanupWorker
import at.bitfire.davdroid.ui.UiUtils
import dagger.hilt.android.HiltAndroidApp
import kotlinx.coroutines.CoroutineDispatcher
import kotlinx.coroutines.DelicateCoroutinesApi
import kotlinx.coroutines.GlobalScope
import kotlinx.coroutines.launch
import java.util.logging.Logger
import javax.inject.Inject

@HiltAndroidApp
class App: Application(), Configuration.Provider {

    @Inject
    lateinit var logger: Logger

    /**
     * Creates the [LogManager] singleton and thus initializes logging.
     */
    @Inject
    lateinit var logManager: LogManager

    @Inject
    @DefaultDispatcher
    lateinit var defaultDispatcher: CoroutineDispatcher

    @Inject
    lateinit var plugins: Set<@JvmSuppressWildcards StartupPlugin>

    @Inject
    lateinit var workerFactory: HiltWorkerFactory

    override val workManagerConfiguration: Configuration
        get() = Configuration.Builder()
            .setWorkerFactory(workerFactory)
            .build()

    override fun onCreate() {
        super.onCreate()
        logger.fine("Logging using LogManager $logManager")

        // set light/dark mode
        UiUtils.updateTheme(this)   // when this is called in the asynchronous thread below, it recreates
                                    // some current activity and causes an IllegalStateException in rare cases

        // run startup plugins (sync)
        for (plugin in plugins.sortedBy { it.priority() }) {
            logger.fine("Running startup plugin: $plugin (onAppCreate)")
            plugin.onAppCreate()
        }

        // don't block UI for some background checks
        @OptIn(DelicateCoroutinesApi::class)
        GlobalScope.launch(defaultDispatcher) {
            // clean up orphaned accounts in DB from time to time
            AccountsCleanupWorker.enable(this@App)

            // create/update app shortcuts
            UiUtils.updateShortcuts(this@App)

            // run startup plugins (async)
            for (plugin in plugins.sortedBy { it.priorityAsync() }) {
                logger.fine("Running startup plugin: $plugin (onAppCreateAsync)")
                plugin.onAppCreateAsync()
            }
        }
    }

}

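The startup-plugin loop in onCreate() above simply sorts the injected plugins by their priority and runs them in order. A self-contained sketch of that pattern follows; it uses a local stand-in interface because the real at.bitfire.davdroid.startup.StartupPlugin is not shown in this diff, so its exact signatures are assumed:

```
import java.util.logging.Logger

// Local stand-in interface, for illustration only; the real StartupPlugin may differ.
interface DemoStartupPlugin {
    fun priority(): Int
    fun onAppCreate()
}

// Hypothetical plugins that just log once at app start; lower priority values run earlier.
class CrashHandlerPlugin : DemoStartupPlugin {
    override fun priority() = 10
    override fun onAppCreate() = Logger.getGlobal().fine("Installing crash handler")
}

class ShortcutPlugin : DemoStartupPlugin {
    override fun priority() = 20
    override fun onAppCreate() = Logger.getGlobal().fine("Registering app shortcuts")
}

// Same ordering logic as the synchronous loop in App.onCreate().
fun runStartupPlugins(plugins: Set<DemoStartupPlugin>) {
    for (plugin in plugins.sortedBy { it.priority() })
        plugin.onAppCreate()
}
```
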
Some files were not shown because too many files have changed in this diff.