mirror of https://github.com/google/nomulus synced 2026-02-02 19:12:27 +00:00

Compare commits


31 Commits

Author SHA1 Message Date
Ben McIlwain
8c04bf2599 Rename InjectRule and remove some JUnit4-only dependencies (#726)
* Rename InjectRule and remove some deps

* Merge remote-tracking branch 'upstream/master' into final-core-tests
2020-07-29 20:27:47 -04:00
Ben McIlwain
34116e3811 Clarify un-avail reason on allocation-token-reserved domains (#725)
Apparently, in domain check responses, `avail=false, reason=Allocation token
required` was not sufficiently understood by all registrars. This changes it to
`avail=false, reason=Reserved; alloc. token required` to hopefully make it
crystal clear that the domain in question is reserved, i.e. if you were supposed
to be able to register this domain you'd already know it because we'd have
already given you the requisite allocation token.
2020-07-29 17:13:38 -04:00
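The string change above can be sketched as a tiny mapping (an illustrative Python sketch; the real logic lives in the Java domain check flow, and the helper name here is hypothetical):

```python
def check_reason(is_reserved_with_token: bool):
    """Returns (avail, reason) for a domain check response (illustrative).

    Before this change the reason was "Allocation token required", which
    registrars misread; the new string makes the reserved status explicit.
    """
    if is_reserved_with_token:
        return (False, "Reserved; alloc. token required")
    return (True, None)
```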
Lai Jiang
d180ef43ac Move the documentation package to its own subproject (#722)
This makes it easier to later migrate the package to Java 11. If we move
and migrate in a single PR, then because so much of the content is
changed, git will have trouble recognizing that some files are
renamed *and* modified and will treat them as distinct files, making code
review difficult.
2020-07-29 13:41:02 -04:00
Michael Muller
f55270c46f Integrate transaction persistence into JpaTM (#717)
* Integrate transaction persistence into JpaTM

Store the serialized transaction whenever we commit from the JPA transaction
manager.  This change also adds:

-   The Transaction table.
-   The TransactionEntity which is stored in it.
-   Changes to the test infrastructure to register the TransactionEntity for
    tests where we don't load the nomulus schema.
-   A new configuration variable to allow us to turn the transaction
    persistence functionality on and off (default is "off").

* Changes for review.

* Incremented sequence number of flyway file
2020-07-28 19:23:44 -04:00
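The gating described in that commit can be sketched minimally as follows (names here are hypothetical stand-ins; the real code lives in the JPA transaction manager and Nomulus configuration):

```python
class FakeJpaTransactionManager:
    """Stores the serialized transaction on commit when persistence is on."""

    def __init__(self, persist_transactions: bool = False):
        # Mirrors the new configuration variable, defaulting to "off".
        self.persist_transactions = persist_transactions
        self.transaction_table = []  # stands in for the Transaction table

    def commit(self, serialized_transaction: bytes) -> None:
        if self.persist_transactions:
            # Stands in for saving a TransactionEntity row.
            self.transaction_table.append(serialized_transaction)
```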
Ben McIlwain
d6d9874da1 Upgrade App Engine and webserver tests from JUnit 4 to 5 (#720)
* Upgrade App Engine and webserver tests from JUnit 4 to 5

* Fix most errors

* Merge branch 'master' into junit5ification

* Fix test server by extracting non-test setup/tear-down

* Merge branch 'master' into junit5ification

* Fix backup tests

* Don't createFile(); asCharSink does it

* Increase the timeout for all WebDriver tests to 60s (helps w/ flakiness)
2020-07-28 14:18:16 -04:00
gbrodman
e0d04cec4f Set up deployment of the Spec11 pipeline with JPA TM (#716)
* Set up deployment of the Spec11 pipeline with JPA TM

* Remove unnecessary pipeline options setting

* Use environment name in BeamJpaModuleTest

* Fix checkstyle error
2020-07-27 21:04:52 -04:00
Michael Muller
0ce431212e Add the :nom:generate_golden_schema pseudo-task (#718)
Add a "pseudo-task" in nom_build to do the three-step process of generating
the golden schema. In the course of this, add support for pseudo-tasks in
general, improve the database directory README, and make nom_build not call
gradlew if there are no tasks.
2020-07-27 18:33:16 -04:00
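The three-step process that pseudo-task wraps can be sketched like this (the callables are injected here purely for illustration; the actual task shells out to gradlew):

```python
def generate_golden_file(run_db_test, copy_schema) -> None:
    """Sketch of the :nom:generate_golden_file pseudo-task's three steps.

    run_db_test: returns the exit code of the ':db:test' gradle task.
    copy_schema: copies the freshly dumped schema over the golden file.
    """
    # Step 1: run the schema test; a failure is *expected* when the
    # flyway files have changed the schema.
    if run_db_test() == 0:
        return  # golden schema already up to date
    copy_schema()  # Step 2: install the newly dumped schema as golden.
    # Step 3: re-run the test, which must now pass.
    if run_db_test() != 0:
        raise RuntimeError("golden file test failed after copying schema")
```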
gbrodman
32868b3ab8 Run the (Un)lockDomainCommand in an outer JPA txn (#688)
* Run the (Un)lockDomainCommand in an outer JPA txn

There are a couple things going on here in this commit.

First, we add an external JPA transaction in the
LockOrUnlockDomainCommand class. This doesn't appear to do much, but it
avoids a situation similar to deadlock if an error occurs in Datastore
when saving the domain object. Specifically, DomainLockUtils relies on
the fact that any error in Datastore will be re-thrown in the JPA
transaction, meaning that any Datastore error will back out of the SQL
transaction as well. However, this is no longer true if we are already
in a Datastore transaction when calling DomainLockUtils (unless, again,
we are also in a JPA transaction). Basically, we require that the outer
transaction is the JPA one.

Second, this allows for more break-glass operations in the lock or
unlock domain commands -- in a situation where things possibly go
haywire, we should allow admins to ensure with certainty that a
domain is locked or unlocked.

* Add more robustness and tests for admins locking locked domains

* Fix expected exception message in tests
2020-07-27 18:16:24 -04:00
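The transaction-nesting requirement in that commit rests on exception propagation: an error thrown by the inner (Datastore) work must escape through the enclosing JPA transaction and roll it back. A minimal sketch, with hypothetical names:

```python
class FakeJpaTransaction:
    """Context manager illustrating why the JPA transaction must be
    outermost: an exception from inner Datastore work propagates out,
    rolling the SQL transaction back as well."""

    def __init__(self):
        self.committed = False
        self.rolled_back = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            self.committed = True
        else:
            self.rolled_back = True
        return False  # re-raise so the caller still sees the inner error
```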
Shicong Huang
0ecc20b48c Rename a V40 flyway file to V41 to resolve conflict (#719) 2020-07-27 15:16:01 -04:00
Shicong Huang
c65af4b480 Add remaining columns to Domain's SQL schema (#702) 2020-07-27 13:32:39 -04:00
Legina Chen
3a15a8bdc7 Drop foreign key constraint for Registrar table (#715) 2020-07-27 09:05:30 -07:00
Weimin Yu
9806fab880 Use rearranged sql credentials in flyway task (#712)
* Use rearranged sql credentials in flyway task

Let the flyway tasks use the sql credential files set up for BEAM
pipelines.

Credential files have been created for each environment in GCS
at gs://${project}-beam/cloudsql/admin_credential.enc. All
project editors have access to this file, including the Dataflow
control service account.

Alpha and crash use the 'nomulus-tools-key' in their own project to
decrypt the credential file.

Sandbox and production use the 'nomulus-tools-key' in
domain-registry-dev to decrypt the credential file.

Note that this setup is temporary. It will become obsolete once
we migrate to Cloud Secret Manager for secret storage.
2020-07-24 15:32:01 -04:00
Weimin Yu
6591e0672a End-to-end Datastore to SQL pipeline (#707)
* End-to-end Datastore to SQL pipeline

Defined InitSqlPipeline that performs end-to-end migration from
a Datastore backup to a SQL database.

Also fixed/refined multiple tests related to this migration.
2020-07-24 09:57:43 -04:00
Ben McIlwain
91b7d92cf8 Upgrade TestPipeline extension from JUnit 4 to 5 2020-07-23 21:21:58 -04:00
Ben McIlwain
33910613da Get presubmits passing
This involves Guava -> Java 8 util migrations and fixing the license header.
2020-07-23 21:21:58 -04:00
Ben McIlwain
1fde678250 Copy TestPipeline rule from Apache Beam project into our codebase
This is copied in here with the absolute minimum # of modifications required
(just a rename to JUnit 5 format and some small fixes required to enable
compilation to be successful).

This is in preparation for the next commit where I'll convert this Rule into a
JUnit 5 extension, which is the entire goal here. But I wanted to get the code
from Apache Beam in with the maximum possible fidelity so that my changes will
be in a separate commit and will thus be obvious.

Note that we do unfortunately need to modify/rewrite the Rule itself; merely
wrapping it in some manner isn't possible.
2020-07-23 21:21:58 -04:00
gbrodman
8d56577653 Don't run presubmits over the .git folder (#711) 2020-07-23 18:12:34 -04:00
Ben McIlwain
3891d411de Upgrade most of remaining tests from JUnit 4 to JUnit 5 (#708) 2020-07-23 15:43:59 -04:00
gbrodman
cadecb15d8 Rename the email field in UI and include rlock email if it exists (#697)
* Rename the email field in UI and include rlock email if it exists

* Change the capitalization of fields and titles and add a description
2020-07-23 14:30:12 -04:00
gbrodman
9b7f6ce500 Fix some SQL credential issues identified when deploying Beam pipelines (#706)
* Fix some SQL credential issues identified when deploying Beam pipelines

There are two issues fixed here.
1. Without calling `FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create())`, the Nomulus tool doesn't know how to handle gs:// scheme files. Thus, if you try to deploy (for instance) the Spec11 pipeline using a GCS credential file, it fails.
2. There was a misunderstanding before about what the credential file
actually refers to -- there is a credential file in JSON format that is
used for gcloud authorization, and there is a space-delimited SQL access
info file that has the instance name, username, and password. These are
separate options and should have separate command-line params.

* Actually we don't need this for remote deployment
2020-07-22 16:52:31 -04:00
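The distinction between the two files can be illustrated with a hedged sketch of the space-delimited access-info format; the field order (instance, username, password) is an assumption based on the commit message, and the real parsing lives in BeamJpaModule:

```python
def parse_sql_access_info(line: str) -> dict:
    """Parses one space-delimited SQL access-info line (illustrative).

    Splits at most twice so a password containing spaces survives intact.
    """
    instance, username, password = line.split(" ", 2)
    return {"instance": instance, "username": username, "password": password}
```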
Ben McIlwain
cd23748fe8 Upgrade rest of tools test classes to JUnit 5 (#705) 2020-07-22 11:09:21 -04:00
Ben McIlwain
cf41f5d354 Upgrade all remaining flows tests to JUnit 5 (#704) 2020-07-21 19:52:33 -04:00
Ben McIlwain
9a5ba249db Upgrade converters/TMCH/RDAP to JUnit 5 (#703)
Also renames some existing Rules to Extensions (and removes JUnit 4 features
from them entirely if no longer being used).
2020-07-21 18:48:41 -04:00
Shicong Huang
f5186f8476 Merge two PremiumList entities (#690) 2020-07-21 18:18:52 -04:00
Lai Jiang
4e0ca19d2e Remove IDN elements from BRDA (#670)
Also added unit tests for RdeStagingReducer.
2020-07-21 15:29:32 -04:00
Ben McIlwain
c812807ab3 Upgrade mapreduce and DNS tests from JUnit 4 to JUnit 5 (#701)
* Upgrade mapreduce and DNS tests from JUnit 4 to JUnit 5

* Merge branch 'master' into junit5-batch-and-dns
2020-07-20 21:33:24 -04:00
Ben McIlwain
9edb43f3e4 Upgrade command test classes from JUnit 4 to JUnit 5 (#700)
* Convert first batch of command tests to JUnit 5

* Upgrade rest of command tests to JUnit 5

* Migrate the last few test classes
2020-07-20 20:45:52 -04:00
gbrodman
b721533759 Create an ImmutableObjectSubject for comparing SQL objects (#695)
* Create an ImmutableObjectSubject for comparing SQL objects

Many times, when comparing objects that are loaded in from / saved to
SQL in tests, there are some fields we don't care about. Specifically,
we might not care about the last update time, revision ID, or other
things like that that are autoassigned by the DB. If we use this, we can
ignore those fields while still comparing the other ones.

* Create an ImmutableObject Correspondence for more flexible usage
2020-07-20 13:14:09 -04:00
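The field-ignoring comparison that commit describes can be sketched in Python (an illustrative analogue only; the real implementation is a Java Truth Subject operating on ImmutableObject fields):

```python
def equal_ignoring(a: dict, b: dict, ignored_fields: set) -> bool:
    """Compares two objects while skipping DB-autoassigned fields,
    e.g. revision ID or last update time (illustrative sketch)."""
    keys = (set(a) | set(b)) - ignored_fields
    return all(a.get(k) == b.get(k) for k in keys)
```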
gbrodman
ce35f6bc93 Include the user's registry lock email in the lock/unlock modal (#696)
* Include the user's registry lock email in the lock/unlock modal
2020-07-20 12:01:34 -04:00
gbrodman
f7a67b7676 Add a 'Host' parameter to the relock action enqueuer (#699)
* Add a 'Host' parameter to the relock action enqueuer

I believe this is why we are seeing 404s currently -- we should be
specifying the backend host as the target like we do for the
resave-entity async action.
2020-07-17 15:35:44 -04:00
gbrodman
4438944900 Validate potentially-invalid domain names when (un)locking domains (#698)
* Validate potentially-invalid domain names when (un)locking domains
2020-07-17 12:05:19 -04:00
589 changed files with 11304 additions and 9085 deletions


@@ -1,4 +0,0 @@
# This is a Gradle generated file for dependency locking.
# Manual edits can break the build and are not advised.
# This file is expected to be part of source control.
com.google.errorprone:javac:9+181-r4173-1


@@ -16,6 +16,7 @@ package google.registry.testing.truth;
import static com.google.common.base.Preconditions.checkNotNull;
import static com.google.common.truth.Truth.assertAbout;
import static com.google.common.truth.Truth.assertWithMessage;
import static java.nio.charset.StandardCharsets.UTF_8;
import com.github.difflib.DiffUtils;
@@ -31,6 +32,7 @@ import com.google.common.collect.ImmutableList;
import com.google.common.io.Resources;
import com.google.common.truth.Fact;
import com.google.common.truth.FailureMetadata;
import com.google.common.truth.SimpleSubjectBuilder;
import com.google.common.truth.Subject;
import java.io.IOException;
import java.net.URL;
@@ -68,6 +70,15 @@ public class TextDiffSubject extends Subject {
this.actual = ImmutableList.copyOf(actual);
}
protected TextDiffSubject(FailureMetadata metadata, URL actual) {
super(metadata, actual);
try {
this.actual = ImmutableList.copyOf(Resources.asCharSource(actual, UTF_8).readLines());
} catch (IOException e) {
throw new RuntimeException(e);
}
}
public TextDiffSubject withDiffFormat(DiffFormat format) {
this.diffFormat = format;
return this;
@@ -100,6 +111,11 @@ public class TextDiffSubject extends Subject {
return assertThat(Resources.asCharSource(resourceUrl, UTF_8).readLines());
}
public static SimpleSubjectBuilder<TextDiffSubject, URL> assertWithMessageAboutUrlSource(
String format, Object... params) {
return assertWithMessage(format, params).about(urlFactory());
}
private static final Subject.Factory<TextDiffSubject, ImmutableList<String>>
TEXT_DIFF_SUBJECT_TEXT_FACTORY = TextDiffSubject::new;
@@ -107,6 +123,13 @@ public class TextDiffSubject extends Subject {
return TEXT_DIFF_SUBJECT_TEXT_FACTORY;
}
private static final Subject.Factory<TextDiffSubject, URL> TEXT_DIFF_SUBJECT_URL_FACTORY =
TextDiffSubject::new;
public static Subject.Factory<TextDiffSubject, URL> urlFactory() {
return TEXT_DIFF_SUBJECT_URL_FACTORY;
}
static String generateUnifiedDiff(
ImmutableList<String> expectedContent, ImmutableList<String> actualContent) {
Patch<String> diff;


@@ -19,6 +19,7 @@ import argparse
import attr
import io
import os
import shutil
import subprocess
import sys
from typing import List, Union
@@ -49,15 +50,30 @@ PROPERTIES_HEADER = """\
# This file defines properties used by the gradle build. It must be kept in
# sync with config/nom_build.py.
#
# To regenerate, run config/nom_build.py --generate-gradle-properties
# To regenerate, run ./nom_build --generate-gradle-properties
#
# To view property descriptions (which are command line flags for
# nom_build), run config/nom_build.py --help.
# nom_build), run ./nom_build --help.
#
# DO NOT EDIT THIS FILE BY HAND
org.gradle.jvmargs=-Xmx1024m
"""
# Help text to be displayed (in addition to the synopsis and flag help, which
# are displayed automatically).
HELP_TEXT = """\
A wrapper around the gradle build that provides the following features:
- Converts properties into flags to guard against property name spelling errors
and to provide help descriptions for all properties.
- Provides pseudo-commands (with the ":nom:" prefix) that encapsulate common
actions that are difficult to implement in gradle.
Pseudo-commands:
:nom:generate_golden_file - regenerates the golden file from the current
set of flyway files.
"""
# Define all of our special gradle properties here.
PROPERTIES = [
Property('mavenUrl',
@@ -114,6 +130,11 @@ PROPERTIES = [
Property('nomulus_version',
'The version of nomulus to test against in a database '
'integration test.'),
Property('dot_path',
'The path to "dot", part of the graphviz package that converts '
'a BEAM pipeline to image. Setting this property to empty string '
'will disable image generation.',
'/usr/bin/dot'),
]
GRADLE_FLAGS = [
@@ -251,8 +272,42 @@ def get_root() -> str:
return cur_dir
def main(args):
parser = argparse.ArgumentParser('nom_build')
class Abort(Exception):
"""Raised to terminate the process with a non-zero error code.
Parameters are ignored.
"""
def do_pseudo_task(task: str) -> None:
root = get_root()
if task == ':nom:generate_golden_file':
if not subprocess.call([f'{root}/gradlew', ':db:test']):
print('\033[33mWARNING:\033[0m Golden schema appears to be '
'up-to-date. If you are making schema changes, be sure to '
'add a flyway file for them.')
return
print('\033[33mWARNING:\033[0m Ignore the above failure, it is '
'expected.')
# Copy the new schema into place.
shutil.copy(f'{root}/db/build/resources/test/testcontainer/'
'mount/dump.txt',
f'{root}/db/src/main/resources/sql/schema/'
'nomulus.golden.sql')
if subprocess.call([f'{root}/gradlew', ':db:test']):
print('\033[31mERROR:\033[0m Golden file test failed after '
'copying schema. Please check your flyway files.')
raise Abort()
else:
print(f'\033[31mERROR:\033[0m Unknown task {task}')
raise Abort()
def main(args) -> int:
parser = argparse.ArgumentParser('nom_build', description=HELP_TEXT,
formatter_class=argparse.RawTextHelpFormatter)
for prop in PROPERTIES:
parser.add_argument('--' + prop.name, default=prop.default,
help=prop.desc)
@@ -291,7 +346,7 @@ def main(args):
if args.generate_gradle_properties:
with open(f'{root}/gradle.properties', 'w') as dst:
dst.write(gradle_properties)
return
return 0
# Verify that the gradle properties file is what we expect it to be.
with open(f'{root}/gradle.properties') as src:
@@ -316,12 +371,39 @@ def main(args):
if flag.has_arg:
gradle_command.append(arg_val)
# See if there are any special ":nom:" pseudo-tasks specified.
got_non_pseudo_tasks = False
for arg in args.non_flag_args[1:]:
if arg.startswith(':nom:'):
if got_non_pseudo_tasks:
# We can't currently deal with the situation of gradle tasks
# before pseudo-tasks. This could be implemented by invoking
# gradle for only the set of gradle tasks before the pseudo
# task, but that's overkill for now.
print(f'\033[31mERROR:\033[0m Pseudo task ({arg}) must be '
'specified prior to all actual gradle tasks. Aborting.')
return 1
do_pseudo_task(arg)
else:
got_non_pseudo_tasks = True
non_flag_args = [
arg for arg in args.non_flag_args[1:] if not arg.startswith(':nom:')]
if not non_flag_args:
if not got_non_pseudo_tasks:
print('\033[33mWARNING:\033[0m No tasks specified. Not '
'doing anything')
return 0
# Add the non-flag args (we exclude the first, which is the command name
# itself) and run.
gradle_command.extend(args.non_flag_args[1:])
subprocess.call(gradle_command)
gradle_command.extend(non_flag_args)
return subprocess.call(gradle_command)
if __name__ == '__main__':
main(sys.argv)
try:
sys.exit(main(sys.argv))
except Abort as ex:
sys.exit(1)


@@ -14,6 +14,7 @@
import io
import os
import shutil
import unittest
from unittest import mock
import nom_build
@@ -67,6 +68,7 @@ class MyTest(unittest.TestCase):
mock.patch.object(nom_build, 'print', self.print_fake).start())
self.call_mock = mock.patch.object(subprocess, 'call').start()
self.copy_mock = mock.patch.object(shutil, 'copy').start()
self.file_contents = {
# Prefill with the actual file contents.
@@ -92,17 +94,32 @@ class MyTest(unittest.TestCase):
def test_no_args(self):
nom_build.main(['nom_build'])
self.assertEqual(self.printed, [])
self.call_mock.assert_called_with([GRADLEW])
self.assertEqual(self.printed,
['\x1b[33mWARNING:\x1b[0m No tasks specified. Not '
'doing anything'])
def test_property_calls(self):
nom_build.main(['nom_build', '--testFilter=foo'])
self.call_mock.assert_called_with([GRADLEW, '-P', 'testFilter=foo'])
nom_build.main(['nom_build', 'task-name', '--testFilter=foo'])
self.call_mock.assert_called_with([GRADLEW, '-P', 'testFilter=foo',
'task-name'])
def test_gradle_flags(self):
nom_build.main(['nom_build', '-d', '-b', 'foo'])
nom_build.main(['nom_build', 'task-name', '-d', '-b', 'foo'])
self.call_mock.assert_called_with([GRADLEW, '--build-file', 'foo',
'--debug'])
'--debug', 'task-name'])
def test_generate_golden_file(self):
self.call_mock.side_effect = [1, 0]
nom_build.main(['nom_build', ':nom:generate_golden_file'])
self.call_mock.assert_has_calls([
mock.call([GRADLEW, ':db:test']),
mock.call([GRADLEW, ':db:test'])
])
def test_generate_golden_file_nofail(self):
self.call_mock.return_value = 0
nom_build.main(['nom_build', ':nom:generate_golden_file'])
self.call_mock.assert_has_calls([mock.call([GRADLEW, ':db:test'])])
unittest.main()


@@ -22,7 +22,7 @@ import sys
import re
# We should never analyze any generated files
UNIVERSALLY_SKIPPED_PATTERNS = {"/build/", "cloudbuild-caches", "/out/"}
UNIVERSALLY_SKIPPED_PATTERNS = {"/build/", "cloudbuild-caches", "/out/", ".git/"}
# We can't rely on CI to have the Enum package installed so we do this instead.
FORBIDDEN = 1
REQUIRED = 2
@@ -93,20 +93,20 @@ PRESUBMITS = {
PresubmitCheck(
r".*\bSystem\.(out|err)\.print", "java", {
"StackdriverDashboardBuilder.java", "/tools/", "/example/",
"RegistryTestServerMain.java", "TestServerRule.java",
"RegistryTestServerMain.java", "TestServerExtension.java",
"FlowDocumentationTool.java"
}):
"System.(out|err).println is only allowed in tools/ packages. Please "
"use a logger instead.",
# ObjectifyService.register is restricted to main/ or AppEngineRule.
# ObjectifyService.register is restricted to main/ or AppEngineExtension.
PresubmitCheck(
r".*\bObjectifyService\.register", "java", {
"/build/", "/generated/", "node_modules/", "src/main/",
"AppEngineRule.java"
"AppEngineExtension.java"
}):
"ObjectifyService.register is not allowed in tests. Please use "
"AppengineRule.register instead.",
"ObjectifyService.register(...) is not allowed in tests. Please use "
"AppEngineExtension.register(...) instead.",
# PostgreSQLContainer instantiation must specify docker tag
PresubmitCheck(


@@ -82,11 +82,6 @@ sourceSets {
main {
java {
srcDirs += generatedDir
// Javadoc API is deprecated in Java 11 and removed in Java 12.
// TODO(jianglai): re-enable after migrating to the new Javadoc API
if ((JavaVersion.current().majorVersion as Integer) >= 11) {
exclude 'google/registry/documentation/**'
}
}
resources {
exclude '**/*.xjb'
@@ -238,6 +233,7 @@ dependencies {
compile deps['jline:jline']
compile deps['joda-time:joda-time']
compile deps['org.apache.avro:avro']
testCompile deps['org.apache.beam:beam-runners-core-construction-java']
testCompile deps['org.apache.beam:beam-runners-direct-java']
compile deps['org.apache.beam:beam-runners-google-cloud-dataflow-java']
compile deps['org.apache.beam:beam-sdks-java-core']
@@ -256,6 +252,7 @@ dependencies {
compile deps['org.bouncycastle:bcpg-jdk15on']
testCompile deps['org.bouncycastle:bcpkix-jdk15on']
compile deps['org.bouncycastle:bcprov-jdk15on']
testCompile deps['com.fasterxml.jackson.core:jackson-databind']
runtime deps['org.glassfish.jaxb:jaxb-runtime']
compile deps['org.hibernate:hibernate-core']
compile deps['org.joda:joda-money']
@@ -272,7 +269,6 @@ dependencies {
compile deps['org.testcontainers:postgresql']
testCompile deps['org.testcontainers:selenium']
testCompile deps['org.testcontainers:testcontainers']
testCompile deps['pl.pragmatists:JUnitParams']
compile deps['xerces:xmlParserAPIs']
compile deps['xpp3:xpp3']
// This dependency must come after javax.mail:mail as it would otherwise
@@ -313,9 +309,9 @@ dependencies {
testCompile deps['org.junit.jupiter:junit-jupiter-engine']
testCompile deps['org.junit.jupiter:junit-jupiter-migrationsupport']
testCompile deps['org.junit.jupiter:junit-jupiter-params']
testCompile deps['org.junit-pioneer:junit-pioneer']
testCompile deps['org.junit.platform:junit-platform-runner']
testCompile deps['org.junit.platform:junit-platform-suite-api']
testCompile deps['org.junit.vintage:junit-vintage-engine']
testCompile deps['org.mockito:mockito-core']
testCompile deps['org.mockito:mockito-junit-jupiter']
runtime deps['org.postgresql:postgresql']
@@ -850,22 +846,6 @@ task generateGoldenImages(type: FilteringTest) {
}
generateGoldenImages.finalizedBy(findGoldenImages)
task flowDocsTool(type: JavaExec) {
systemProperty 'test.projectRoot', rootProject.projectRootDir
systemProperty 'test.resourcesDir', resourcesDir
classpath = sourceSets.main.runtimeClasspath
main = 'google.registry.documentation.FlowDocumentationTool'
def arguments = []
if (rootProject.flowDocsFile) {
arguments << "--output_file=${rootProject.flowDocsFile}"
} else {
arguments << "--output_file=${rootProject.projectRootDir}/docs/flows.md"
}
args arguments
}
task standardTest(type: FilteringTest) {
includeAllTests()
exclude fragileTestPatterns
@@ -967,6 +947,49 @@ task buildToolImage(dependsOn: nomulus, type: Exec) {
commandLine 'docker', 'build', '-t', 'nomulus-tool', '.'
}
task generateInitSqlPipelineGraph(type: Test) {
include "**/InitSqlPipelineGraphTest.*"
testNameIncludePatterns = ["**createPipeline_compareGraph"]
ignoreFailures = true
}
task updateInitSqlPipelineGraph(type: Copy) {
def graphRelativePath = 'google/registry/beam/initsql/'
from ("${projectDir}/build/resources/test/${graphRelativePath}") {
include 'pipeline_curr.dot'
rename 'curr', 'golden'
}
into "src/test/resources/${graphRelativePath}"
dependsOn generateInitSqlPipelineGraph
doLast {
if (com.google.common.base.Strings.isNullOrEmpty(project.dot_path)) {
getLogger().info('Property dot_path is null. Not creating image for pipeline graph.')
}
def dotPath = project.dot_path
if (!new File(dotPath).exists()) {
throw new RuntimeException(
"""\
${dotPath} not found. Make sure graphviz is installed
and the dot_path property is set correctly."""
.stripIndent())
}
def goldenGraph = "src/test/resources/${graphRelativePath}/pipeline_golden.dot"
def goldenImage = "src/test/resources/${graphRelativePath}/pipeline_golden.png"
def cmd = "${dotPath} -Tpng -o \"${goldenImage}\" \"${goldenGraph}\""
try {
rootProject.ext.execInBash(cmd, projectDir)
} catch (Throwable throwable) {
throw new RuntimeException(
"""\
Failed to generate golden image with command ${cmd}
Error: ${throwable.getMessage()}
""")
}
}
}
// Build the devtool jar.
createUberJar(
'devtool',


@@ -243,6 +243,7 @@ org.jboss:jandex:2.1.3.Final
org.jetbrains:annotations:19.0.0
org.joda:joda-money:1.0.1
org.json:json:20160810
org.junit-pioneer:junit-pioneer:0.7.0
org.junit.jupiter:junit-jupiter-api:5.6.2
org.junit.jupiter:junit-jupiter-engine:5.6.2
org.junit.jupiter:junit-jupiter-migrationsupport:5.6.2
@@ -252,7 +253,6 @@ org.junit.platform:junit-platform-engine:1.6.2
org.junit.platform:junit-platform-launcher:1.6.2
org.junit.platform:junit-platform-runner:1.6.2
org.junit.platform:junit-platform-suite-api:1.6.2
org.junit.vintage:junit-vintage-engine:5.6.2
org.junit:junit-bom:5.6.2
org.jvnet.staxex:stax-ex:1.8
org.mockito:mockito-core:3.3.3
@@ -292,6 +292,5 @@ org.tukaani:xz:1.8
org.w3c.css:sac:1.3
org.xerial.snappy:snappy-java:1.1.4
org.yaml:snakeyaml:1.17
pl.pragmatists:JUnitParams:1.1.1
xerces:xmlParserAPIs:2.6.2
xpp3:xpp3:1.1.4c


@@ -241,6 +241,7 @@ org.jboss:jandex:2.1.3.Final
org.jetbrains:annotations:19.0.0
org.joda:joda-money:1.0.1
org.json:json:20160810
org.junit-pioneer:junit-pioneer:0.7.0
org.junit.jupiter:junit-jupiter-api:5.6.2
org.junit.jupiter:junit-jupiter-engine:5.6.2
org.junit.jupiter:junit-jupiter-migrationsupport:5.6.2
@@ -250,7 +251,6 @@ org.junit.platform:junit-platform-engine:1.6.2
org.junit.platform:junit-platform-launcher:1.6.2
org.junit.platform:junit-platform-runner:1.6.2
org.junit.platform:junit-platform-suite-api:1.6.2
org.junit.vintage:junit-vintage-engine:5.6.2
org.junit:junit-bom:5.6.2
org.jvnet.staxex:stax-ex:1.8
org.mockito:mockito-core:3.3.3
@@ -290,6 +290,5 @@ org.tukaani:xz:1.8
org.w3c.css:sac:1.3
org.xerial.snappy:snappy-java:1.1.4
org.yaml:snakeyaml:1.17
pl.pragmatists:JUnitParams:1.1.1
xerces:xmlParserAPIs:2.6.2
xpp3:xpp3:1.1.4c


@@ -246,6 +246,7 @@ org.jboss:jandex:2.1.3.Final
org.jetbrains:annotations:19.0.0
org.joda:joda-money:1.0.1
org.json:json:20160810
org.junit-pioneer:junit-pioneer:0.7.0
org.junit.jupiter:junit-jupiter-api:5.6.2
org.junit.jupiter:junit-jupiter-engine:5.6.2
org.junit.jupiter:junit-jupiter-migrationsupport:5.6.2
@@ -255,7 +256,6 @@ org.junit.platform:junit-platform-engine:1.6.2
org.junit.platform:junit-platform-launcher:1.6.2
org.junit.platform:junit-platform-runner:1.6.2
org.junit.platform:junit-platform-suite-api:1.6.2
org.junit.vintage:junit-vintage-engine:5.6.2
org.junit:junit-bom:5.6.2
org.jvnet.staxex:stax-ex:1.8
org.mockito:mockito-core:3.3.3
@@ -296,6 +296,5 @@ org.tukaani:xz:1.8
org.w3c.css:sac:1.3
org.xerial.snappy:snappy-java:1.1.4
org.yaml:snakeyaml:1.17
pl.pragmatists:JUnitParams:1.1.1
xerces:xmlParserAPIs:2.6.2
xpp3:xpp3:1.1.4c


@@ -246,6 +246,7 @@ org.jboss:jandex:2.1.3.Final
org.jetbrains:annotations:19.0.0
org.joda:joda-money:1.0.1
org.json:json:20160810
org.junit-pioneer:junit-pioneer:0.7.0
org.junit.jupiter:junit-jupiter-api:5.6.2
org.junit.jupiter:junit-jupiter-engine:5.6.2
org.junit.jupiter:junit-jupiter-migrationsupport:5.6.2
@@ -255,7 +256,6 @@ org.junit.platform:junit-platform-engine:1.6.2
org.junit.platform:junit-platform-launcher:1.6.2
org.junit.platform:junit-platform-runner:1.6.2
org.junit.platform:junit-platform-suite-api:1.6.2
org.junit.vintage:junit-vintage-engine:5.6.2
org.junit:junit-bom:5.6.2
org.jvnet.staxex:stax-ex:1.8
org.mockito:mockito-core:3.3.3
@@ -297,6 +297,5 @@ org.tukaani:xz:1.8
org.w3c.css:sac:1.3
org.xerial.snappy:snappy-java:1.1.4
org.yaml:snakeyaml:1.17
pl.pragmatists:JUnitParams:1.1.1
xerces:xmlParserAPIs:2.6.2
xpp3:xpp3:1.1.4c


@@ -22,21 +22,35 @@ import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
/**
* Sets up a placeholder {@link Environment} on a non-AppEngine platform so that Datastore Entities
* can be converted from/to Objectify entities. See {@code DatastoreEntityExtension} in test source
* for more information.
* Sets up a fake {@link Environment} so that the following operations can be performed without the
* Datastore service:
*
* <ul>
* <li>Create Objectify {@code Keys}.
* <li>Instantiate Objectify objects.
* <li>Convert Datastore {@code Entities} to their corresponding Objectify objects.
* </ul>
*
* <p>User has the option to specify their desired {@code appId} string, which forms part of an
* Objectify {@code Key} and is included in the equality check. This feature makes it easy to
* compare a migrated object in SQL with the original in Objectify.
*
* <p>Note that conversion from Objectify objects to Datastore {@code Entities} still requires the
* Datastore service.
*/
public class AppEngineEnvironment implements Closeable {
private static final Environment PLACEHOLDER_ENV = createAppEngineEnvironment();
private boolean isPlaceHolderNeeded;
public AppEngineEnvironment() {
this("PlaceholderAppId");
}
public AppEngineEnvironment(String appId) {
isPlaceHolderNeeded = ApiProxy.getCurrentEnvironment() == null;
// isPlaceHolderNeeded may be true when we are invoked in a test with AppEngineRule.
if (isPlaceHolderNeeded) {
ApiProxy.setEnvironmentForCurrentThread(PLACEHOLDER_ENV);
ApiProxy.setEnvironmentForCurrentThread(createAppEngineEnvironment(appId));
}
}
@@ -48,7 +62,7 @@ public class AppEngineEnvironment implements Closeable {
}
/** Returns a placeholder {@link Environment} that can return hardcoded AppId and Attributes. */
private static Environment createAppEngineEnvironment() {
private static Environment createAppEngineEnvironment(String appId) {
return (Environment)
Proxy.newProxyInstance(
Environment.class.getClassLoader(),
@@ -56,7 +70,7 @@ public class AppEngineEnvironment implements Closeable {
(Object proxy, Method method, Object[] args) -> {
switch (method.getName()) {
case "getAppId":
return "PlaceholderAppId";
return appId;
case "getAttributes":
return ImmutableMap.<String, Object>of();
default:


@@ -169,10 +169,12 @@ public final class AsyncTaskEnqueuer {
lock.getRelockDuration().isPresent(),
"Lock with ID %s not configured for relock",
lock.getRevisionId());
String backendHostname = appEngineServiceUtils.getServiceHostname("backend");
addTaskToQueueWithRetry(
asyncActionsPushQueue,
TaskOptions.Builder.withUrl(RelockDomainAction.PATH)
.method(Method.POST)
.header("Host", backendHostname)
.param(
RelockDomainAction.OLD_UNLOCK_REVISION_ID_PARAM,
String.valueOf(lock.getRevisionId()))


@@ -54,37 +54,39 @@ public class BeamJpaModule {
private static final String GCS_SCHEME = "gs://";
@Nullable private final String credentialFilePath;
@Nullable private final String sqlAccessInfoFile;
@Nullable private final String cloudKmsProjectId;
/**
* Constructs a new instance of {@link BeamJpaModule}.
*
* <p>Note: it is an unfortunately necessary antipattern to check for the validity of
* credentialFilePath in {@link #provideCloudSqlAccessInfo} rather than in the constructor.
* sqlAccessInfoFile in {@link #provideCloudSqlAccessInfo} rather than in the constructor.
* Unfortunately, this is a restriction imposed upon us by Dagger. Specifically, because we use
* this in at least one 1 {@link google.registry.tools.RegistryTool} command(s), it must be
* instantiated in {@code google.registry.tools.RegistryToolComponent} for all possible commands;
* Dagger doesn't permit it to ever be null. For the vast majority of commands, it will never be
* used (so a null credential file path is fine in those cases).
*
* @param credentialFilePath the path to a Cloud SQL credential file. This must refer to either a
* @param sqlAccessInfoFile the path to a Cloud SQL credential file. This must refer to either a
* real encrypted file on GCS as returned by {@link
* BackupPaths#getCloudSQLCredentialFilePatterns} or an unencrypted file on local filesystem
* with credentials to a test database.
*/
public BeamJpaModule(@Nullable String credentialFilePath) {
this.credentialFilePath = credentialFilePath;
public BeamJpaModule(@Nullable String sqlAccessInfoFile, @Nullable String cloudKmsProjectId) {
this.sqlAccessInfoFile = sqlAccessInfoFile;
this.cloudKmsProjectId = cloudKmsProjectId;
}
/** Returns true if the credential file is on GCS (and therefore expected to be encrypted). */
private boolean isCloudSqlCredential() {
return credentialFilePath.startsWith(GCS_SCHEME);
return sqlAccessInfoFile.startsWith(GCS_SCHEME);
}
@Provides
@Singleton
SqlAccessInfo provideCloudSqlAccessInfo(Lazy<CloudSqlCredentialDecryptor> lazyDecryptor) {
checkArgument(!isNullOrEmpty(credentialFilePath), "Null or empty credentialFilePath");
checkArgument(!isNullOrEmpty(sqlAccessInfoFile), "Null or empty sqlAccessInfoFile");
String line = readOnlyLineFromCredentialFile();
if (isCloudSqlCredential()) {
line = lazyDecryptor.get().decrypt(line);
@@ -101,7 +103,7 @@ public class BeamJpaModule {
String readOnlyLineFromCredentialFile() {
try {
ResourceId resourceId = FileSystems.matchSingleFileSpec(credentialFilePath).resourceId();
ResourceId resourceId = FileSystems.matchSingleFileSpec(sqlAccessInfoFile).resourceId();
try (BufferedReader reader =
new BufferedReader(
new InputStreamReader(
@@ -141,8 +143,8 @@ public class BeamJpaModule {
@Provides
@Config("beamCloudKmsProjectId")
static String kmsProjectId() {
return "domain-registry-dev";
String kmsProjectId() {
return cloudKmsProjectId;
}
@Provides


@@ -0,0 +1,75 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.base.Preconditions.checkNotNull;
import com.google.appengine.api.datastore.Entity;
import java.util.Objects;
/** Helper for manipulating {@code DomainBase} when migrating from Datastore to a SQL database. */
final class DomainBaseUtil {
private DomainBaseUtil() {}
/**
* Removes {@link google.registry.model.billing.BillingEvent.Recurring}, {@link
* google.registry.model.poll.PollMessage PollMessages} and {@link
* google.registry.model.host.HostResource name servers} from a Datastore {@link Entity} that
* represents an Ofy {@link google.registry.model.domain.DomainBase}. This breaks the cycle of
* foreign key constraints between these entity kinds, allowing {@code DomainBases} to be inserted
* into the SQL database. See {@link InitSqlPipeline} for a use case, where the full {@code
* DomainBases} are written again during the last stage of the pipeline.
*
* <p>The returned object may be in a bad state. Specifically, {@link
* google.registry.model.eppcommon.StatusValue#INACTIVE} is not added after name servers are
* removed. This only impacts tests.
*
* <p>This operation is performed on a Datastore {@link Entity} instead of an Ofy Java object
* because Objectify requires access to a Datastore service when converting an Ofy object to a
* Datastore {@code Entity}. If we insist on working with Objectify objects, we face a few
* unsatisfactory options:
*
* <ul>
* <li>Connect to our production Datastore, which incurs unnecessary security and code health
* risk.
* <li>Connect to a separate real Datastore instance, which is wasteful and overkill.
* <li>Use an in-memory test Datastore, which is a project health risk in that the test
* Datastore would be added to Nomulus' production binary unless we create a separate
* project for this pipeline.
* </ul>
*
* <p>Given our use case, operating on Datastore entities is the best option.
*
* @throws IllegalArgumentException if input does not represent a DomainBase
*/
static Entity removeBillingAndPollAndHosts(Entity domainBase) {
checkNotNull(domainBase, "domainBase");
checkArgument(
Objects.equals(domainBase.getKind(), "DomainBase"),
"Expecting DomainBase, got %s",
domainBase.getKind());
Entity clone = domainBase.clone();
clone.removeProperty("autorenewBillingEvent");
clone.removeProperty("autorenewPollMessage");
clone.removeProperty("deletePollMessage");
clone.removeProperty("nsHosts");
domainBase.getProperties().keySet().stream()
.filter(s -> s.startsWith("transferData."))
.forEach(s -> clone.removeProperty(s));
return clone;
}
}


@@ -0,0 +1,237 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import static com.google.common.base.Preconditions.checkArgument;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.backup.AppEngineEnvironment;
import google.registry.backup.VersionedEntity;
import google.registry.beam.initsql.BeamJpaModule.JpaTransactionManagerComponent;
import google.registry.beam.initsql.Transforms.RemoveDomainBaseForeignKeys;
import google.registry.model.billing.BillingEvent;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DomainBase;
import google.registry.model.domain.token.AllocationToken;
import google.registry.model.host.HostResource;
import google.registry.model.poll.PollMessage;
import google.registry.model.registrar.Registrar;
import google.registry.model.registrar.RegistrarContact;
import google.registry.model.registry.Registry;
import google.registry.model.reporting.HistoryEntry;
import google.registry.persistence.transaction.JpaTransactionManager;
import java.io.Serializable;
import java.util.Collection;
import java.util.Optional;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.transforms.Wait;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.TupleTag;
import org.joda.time.DateTime;
/**
* A BEAM pipeline that populates a SQL database with data from a Datastore backup.
*
* <p>This pipeline migrates EPP resources and related entities that cross-reference each other. To
* avoid violating foreign key constraints, writes to SQL are ordered by entity kinds. In addition,
* the {@link DomainBase} kind is written twice (see details below). The write order is presented
* below. Although some kinds can be written concurrently, e.g. {@code ContactResource} and {@code
* RegistrarContact}, we do not expect any performance benefit since the limiting resource is the
* number of JDBC connections. Google internal users may refer to <a
* href="http://go/registry-r3-init-sql">the design doc</a> for more information.
*
* <ol>
* <li>{@link Registry}: Assumes that {@code PremiumList} and {@code ReservedList} have been set
* up in the SQL database.
* <li>{@link Registrar}: logically depends on {@code Registry}; foreign key not modeled yet.
* <li>{@link ContactResource}: references {@code Registrar}
* <li>{@link RegistrarContact}: references {@code Registrar}.
* <li>Cleansed {@link DomainBase}: with references to {@code BillingEvent}, {@code Recurring},
* {@code Cancellation} and {@code HostResource} removed, still references {@code Registrar}
* and {@code ContactResource}. The removal breaks circular foreign key references.
* <li>{@link HostResource}: references {@code DomainBase}.
* <li>{@link HistoryEntry}: maps to one of three SQL entity types and may reference {@code
* Registrar}, {@code ContactResource}, {@code HostResource}, and {@code DomainBase}.
* <li>{@link AllocationToken}: references {@code HistoryEntry}.
* <li>{@link BillingEvent.Recurring}: references {@code Registrar}, {@code DomainBase} and {@code
* HistoryEntry}.
* <li>{@link BillingEvent.OneTime}: references {@code Registrar}, {@code DomainBase}, {@code
* BillingEvent.Recurring}, {@code HistoryEntry} and {@code AllocationToken}.
* <li>{@link BillingEvent.Modification}: SQL model TBD. Will reference {@code Registrar}, {@code
* DomainBase} and {@code BillingEvent.OneTime}.
* <li>{@link BillingEvent.Cancellation}: references {@code Registrar}, {@code DomainBase}, {@code
* BillingEvent.Recurring}, {@code BillingEvent.OneTime}, and {@code HistoryEntry}.
* <li>{@link PollMessage}: references {@code Registrar}, {@code DomainBase}, {@code
* ContactResource}, {@code HostResource}, and {@code HistoryEntry}.
* <li>{@link DomainBase}, original copy from Datastore.
* </ol>
*/
public class InitSqlPipeline implements Serializable {
/**
* Datastore kinds to be written to the SQL database before the cleansed version of {@link
* DomainBase}.
*/
// TODO(weiminyu): include Registry.class when it is modeled in JPA.
private static final ImmutableList<Class<?>> PHASE_ONE_ORDERED =
ImmutableList.of(Registrar.class, ContactResource.class);
/**
* Datastore kinds to be written to the SQL database after the cleansed version of {@link
* DomainBase}.
*
* <p>The following entities are missing from the list:
*
* <ul>
* <li>Those not modeled in JPA yet, e.g., {@code BillingEvent.Modification}.
* <li>Those awaiting sanitization, e.g., {@code HistoryEntry}, which would have duplicate keys
* after converting to SQL model.
* <li>Those that have foreign key constraints on the above.
* </ul>
*/
// TODO(weiminyu): add more entities when available.
private static final ImmutableList<Class<?>> PHASE_TWO_ORDERED =
ImmutableList.of(HostResource.class);
private final InitSqlPipelineOptions options;
private final Pipeline pipeline;
private final SerializableFunction<JpaTransactionManagerComponent, JpaTransactionManager>
jpaGetter;
InitSqlPipeline(InitSqlPipelineOptions options) {
this.options = options;
pipeline = Pipeline.create(options);
jpaGetter = JpaTransactionManagerComponent::cloudSqlJpaTransactionManager;
}
@VisibleForTesting
InitSqlPipeline(InitSqlPipelineOptions options, Pipeline pipeline) {
this.options = options;
this.pipeline = pipeline;
jpaGetter = JpaTransactionManagerComponent::localDbJpaTransactionManager;
}
public PipelineResult run() {
setupPipeline();
return pipeline.run();
}
@VisibleForTesting
void setupPipeline() {
PCollectionTuple datastoreSnapshot =
pipeline.apply(
"Load Datastore snapshot",
Transforms.loadDatastoreSnapshot(
options.getDatastoreExportDir(),
options.getCommitLogDir(),
DateTime.parse(options.getCommitLogStartTimestamp()),
DateTime.parse(options.getCommitLogEndTimestamp()),
ImmutableSet.<String>builder()
.add("DomainBase")
.addAll(toKindStrings(PHASE_ONE_ORDERED))
.addAll(toKindStrings(PHASE_TWO_ORDERED))
.build()));
// Set up the pipeline to write entity kinds from PHASE_ONE_ORDERED to SQL. Return an object
// that signals the completion of the phase.
PCollection<Void> blocker =
scheduleOnePhaseWrites(datastoreSnapshot, PHASE_ONE_ORDERED, Optional.empty(), null);
blocker =
writeToSql(
"DomainBase without circular foreign keys",
removeDomainBaseForeignKeys(datastoreSnapshot)
.apply("Wait on phase one", Wait.on(blocker)));
// Set up the pipeline to write entity kinds from PHASE_TWO_ORDERED to SQL. This phase won't
// start until all cleansed DomainBases have been written (started by the line above).
scheduleOnePhaseWrites(
datastoreSnapshot, PHASE_TWO_ORDERED, Optional.of(blocker), "DomainBaseNoFkeys");
}
private PCollection<VersionedEntity> removeDomainBaseForeignKeys(
PCollectionTuple datastoreSnapshot) {
PCollection<VersionedEntity> domainBases =
datastoreSnapshot.get(Transforms.createTagForKind("DomainBase"));
return domainBases.apply(
"Remove circular foreign keys from DomainBase",
ParDo.of(new RemoveDomainBaseForeignKeys()));
}
/**
* Sets up the pipeline to write entities in {@code entityClasses} to SQL. Entities are written
* one kind at a time based on each kind's position in {@code entityClasses}. Concurrency exists
* within each kind.
*
* @param datastoreSnapshot the Datastore snapshot of all data to be migrated to SQL
* @param entityClasses the entity types in write order
* @param blockingPCollection the pipeline stage that blocks this phase
* @param blockingTag description of the stage (if one exists) that blocks this phase. Needed for
* generating unique transform ids
* @return the output {@code PCollection} from the writing of the last entity kind. Other parts of
* the pipeline can {@link Wait} on this object
*/
private PCollection<Void> scheduleOnePhaseWrites(
PCollectionTuple datastoreSnapshot,
Collection<Class<?>> entityClasses,
Optional<PCollection<Void>> blockingPCollection,
String blockingTag) {
checkArgument(!entityClasses.isEmpty(), "Each phase must have at least one kind.");
ImmutableList<TupleTag<VersionedEntity>> tags =
toKindStrings(entityClasses).stream()
.map(Transforms::createTagForKind)
.collect(ImmutableList.toImmutableList());
PCollection<Void> prev = blockingPCollection.orElse(null);
String prevTag = blockingTag;
for (TupleTag<VersionedEntity> tag : tags) {
PCollection<VersionedEntity> curr = datastoreSnapshot.get(tag);
if (prev != null) {
curr = curr.apply("Wait on " + prevTag, Wait.on(prev));
}
prev = writeToSql(tag.getId(), curr);
prevTag = tag.getId();
}
return prev;
}
private PCollection<Void> writeToSql(String transformId, PCollection<VersionedEntity> data) {
String credentialFileUrl =
options.getSqlCredentialUrlOverride() != null
? options.getSqlCredentialUrlOverride()
: BackupPaths.getCloudSQLCredentialFilePatterns(options.getEnvironment()).get(0);
return data.apply(
"Write to sql: " + transformId,
Transforms.writeToSql(
transformId,
options.getMaxConcurrentSqlWriters(),
options.getSqlWriteBatchSize(),
new JpaSupplierFactory(credentialFileUrl, options.getCloudKmsProjectId(), jpaGetter)));
}
private static ImmutableList<String> toKindStrings(Collection<Class<?>> entityClasses) {
try (AppEngineEnvironment env = new AppEngineEnvironment()) {
return entityClasses.stream().map(Key::getKind).collect(ImmutableList.toImmutableList());
}
}
}
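The two-phase write order above is, in effect, a topological sort of entity kinds over their foreign-key references: every referenced kind must be persisted before its referrers. A standalone sketch of that idea (the kind names and dependency edges here are illustrative, not the pipeline's actual graph):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/** Toy topological ordering of entity kinds by their foreign-key dependencies. */
public class WriteOrder {

  /**
   * Given a map from each kind to the kinds it references, returns a write order in which every
   * referenced kind appears before any kind that references it.
   */
  static List<String> writeOrder(Map<String, List<String>> deps) {
    List<String> order = new ArrayList<>();
    Set<String> visited = new HashSet<>();
    for (String kind : deps.keySet()) {
      visit(kind, deps, visited, order);
    }
    return order;
  }

  private static void visit(
      String kind, Map<String, List<String>> deps, Set<String> visited, List<String> order) {
    if (!visited.add(kind)) {
      return; // Already placed in the order.
    }
    for (String dep : deps.getOrDefault(kind, List.of())) {
      visit(dep, deps, visited, order); // Dependencies are appended first.
    }
    order.add(kind);
  }
}
```

Note that a plain sort cannot resolve the DomainBase cycle; that is why the pipeline writes a cleansed DomainBase copy between the two phases and the full copy at the end.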


@@ -0,0 +1,91 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import javax.annotation.Nullable;
import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.Validation;
/** Pipeline options for {@link InitSqlPipeline} */
public interface InitSqlPipelineOptions extends GcpOptions {
@Description(
"Overrides the URL to the SQL credential file. " + "Required if environment is not provided.")
@Nullable
String getSqlCredentialUrlOverride();
void setSqlCredentialUrlOverride(String credentialUrlOverride);
@Description("The root directory of the export to load.")
String getDatastoreExportDir();
void setDatastoreExportDir(String datastoreExportDir);
@Description("The directory that contains all CommitLog files.")
String getCommitLogDir();
void setCommitLogDir(String commitLogDir);
@Description("The earliest CommitLogs to load, in ISO8601 format.")
@Validation.Required
String getCommitLogStartTimestamp();
void setCommitLogStartTimestamp(String commitLogStartTimestamp);
@Description("The latest CommitLogs to load, in ISO8601 format.")
@Validation.Required
String getCommitLogEndTimestamp();
void setCommitLogEndTimestamp(String commitLogEndTimestamp);
@Description(
"The deployed environment, alpha, crash, sandbox, or production. "
+ "Required unless sqlCredentialUrlOverride is provided.")
@Nullable
String getEnvironment();
void setEnvironment(String environment);
@Description(
"The GCP project that contains the keyring used for decrypting the " + "SQL credential file.")
@Nullable
String getCloudKmsProjectId();
void setCloudKmsProjectId(String cloudKmsProjectId);
@Description(
"The maximum JDBC connection pool size on a VM. "
+ "This value should be equal to or greater than the number of cores on the VM.")
@Default.Integer(4)
int getJdbcMaxPoolSize();
void setJdbcMaxPoolSize(int jdbcMaxPoolSize);
@Description(
"A hint to the pipeline runner of the maximum number of concurrent SQL writers to create. "
+ "Note that multiple writers may run on the same VM and share the connection pool.")
@Default.Integer(4)
int getMaxConcurrentSqlWriters();
void setMaxConcurrentSqlWriters(int maxConcurrentSqlWriters);
@Description("The number of entities to be written to the SQL database in one transaction.")
@Default.Integer(20)
int getSqlWriteBatchSize();
void setSqlWriteBatchSize(int sqlWriteBatchSize);
}
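The commit-log window above is bounded by two ISO8601 timestamps, which the pipeline parses with Joda-Time's `DateTime.parse`. A minimal `java.time` sketch of the kind of range validation involved (the helper name is hypothetical, and `Instant.parse` is stricter than Joda, requiring a UTC `Z` suffix):

```java
import java.time.Instant;

/** Validates an ISO8601 commit-log window; a sketch, not the pipeline's actual check. */
public class CommitLogWindow {

  /** Parses both timestamps and requires the start to be strictly before the end. */
  static void checkWindow(String startTimestamp, String endTimestamp) {
    Instant start = Instant.parse(startTimestamp);
    Instant end = Instant.parse(endTimestamp);
    if (!start.isBefore(end)) {
      throw new IllegalArgumentException(
          "commitLogStartTimestamp must be before commitLogEndTimestamp");
    }
  }
}
```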


@@ -0,0 +1,48 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import google.registry.beam.initsql.BeamJpaModule.JpaTransactionManagerComponent;
import google.registry.beam.initsql.Transforms.SerializableSupplier;
import google.registry.persistence.transaction.JpaTransactionManager;
import javax.annotation.Nullable;
import org.apache.beam.sdk.transforms.SerializableFunction;
public class JpaSupplierFactory implements SerializableSupplier<JpaTransactionManager> {
private static final long serialVersionUID = 1L;
private final String credentialFileUrl;
@Nullable private final String cloudKmsProjectId;
private final SerializableFunction<JpaTransactionManagerComponent, JpaTransactionManager>
jpaGetter;
public JpaSupplierFactory(
String credentialFileUrl,
@Nullable String cloudKmsProjectId,
SerializableFunction<JpaTransactionManagerComponent, JpaTransactionManager> jpaGetter) {
this.credentialFileUrl = credentialFileUrl;
this.cloudKmsProjectId = cloudKmsProjectId;
this.jpaGetter = jpaGetter;
}
@Override
public JpaTransactionManager get() {
return jpaGetter.apply(
DaggerBeamJpaModule_JpaTransactionManagerComponent.builder()
.beamJpaModule(new BeamJpaModule(credentialFileUrl, cloudKmsProjectId))
.build());
}
}


@@ -1,3 +1,17 @@
## Summary
This package contains a BEAM pipeline that populates a Cloud SQL database from a Datastore backup.
This package contains a BEAM pipeline that populates a Cloud SQL database from a
Datastore backup. The pipeline uses an unsynchronized Datastore export and
overlapping CommitLogs generated by the Nomulus server to recreate a consistent
Datastore snapshot, and writes the data to a Cloud SQL instance.
## Pipeline Visualization
The golden flow graph of the InitSqlPipeline is saved both as a text-based
[DOT file](../../../../../../test/resources/google/registry/beam/initsql/pipeline_golden.dot)
and a
[.png file](../../../../../../test/resources/google/registry/beam/initsql/pipeline_golden.png).
A test compares the flow graph of the current pipeline with the golden graph,
and will fail if changes are detected. When this happens, run the Gradle task
':core:updateInitSqlPipelineGraph' to update the golden files and review the
changes.


@@ -17,8 +17,10 @@ package google.registry.beam.initsql;
import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.base.Preconditions.checkNotNull;
import static com.google.common.base.Preconditions.checkState;
import static com.google.common.base.Throwables.throwIfUnchecked;
import static google.registry.beam.initsql.BackupPaths.getCommitLogTimestamp;
import static google.registry.beam.initsql.BackupPaths.getExportFilePatterns;
import static google.registry.persistence.JpaRetries.isFailedTxnRetriable;
import static google.registry.persistence.transaction.TransactionManagerFactory.jpaTm;
import static google.registry.persistence.transaction.TransactionManagerFactory.setJpaTm;
import static google.registry.util.DateTimeUtils.START_OF_TIME;
@@ -29,14 +31,16 @@ import static org.apache.beam.sdk.values.TypeDescriptors.kvs;
import static org.apache.beam.sdk.values.TypeDescriptors.strings;
import avro.shaded.com.google.common.collect.Iterators;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.EntityTranslator;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Throwables;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.Streams;
import google.registry.backup.AppEngineEnvironment;
import google.registry.backup.CommitLogImports;
import google.registry.backup.VersionedEntity;
import google.registry.model.domain.DomainBase;
import google.registry.model.ofy.ObjectifyService;
import google.registry.model.ofy.Ofy;
import google.registry.persistence.transaction.JpaTransactionManager;
@@ -49,7 +53,6 @@ import java.util.Optional;
import java.util.Set;
import java.util.concurrent.ThreadLocalRandom;
import java.util.function.Supplier;
import javax.persistence.OptimisticLockException;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.Compression;
import org.apache.beam.sdk.io.FileIO;
@@ -70,7 +73,6 @@ import org.apache.beam.sdk.values.PBegin;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionList;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.PDone;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;
import org.apache.beam.sdk.values.TypeDescriptor;
@@ -225,7 +227,7 @@ public final class Transforms {
return new PTransform<PCollection<String>, PCollection<Metadata>>() {
@Override
public PCollection<Metadata> expand(PCollection<String> input) {
return input.apply(FileIO.matchAll().withEmptyMatchTreatment(EmptyMatchTreatment.DISALLOW));
return input.apply(FileIO.matchAll().withEmptyMatchTreatment(EmptyMatchTreatment.ALLOW));
}
};
}
@@ -263,6 +265,11 @@ public final class Transforms {
/**
* Returns a {@link PTransform} that writes a {@link PCollection} of entities to a SQL database
* and outputs an empty {@code PCollection<Void>}. This allows other operations to {@link
* org.apache.beam.sdk.transforms.Wait wait} for the completion of this transform.
*
* <p>Errors are handled according to the pipeline runner's default policy. As part of a one-time
* job, we will not add features unless proven necessary.
*
* @param transformId a unique ID for an instance of the returned transform
* @param maxWriters the max number of concurrent writes to SQL, which also determines the max
@@ -270,22 +277,21 @@ public final class Transforms {
* @param batchSize the number of entities to write in each operation
* @param jpaSupplier supplier of a {@link JpaTransactionManager}
*/
public static PTransform<PCollection<VersionedEntity>, PDone> writeToSql(
public static PTransform<PCollection<VersionedEntity>, PCollection<Void>> writeToSql(
String transformId,
int maxWriters,
int batchSize,
SerializableSupplier<JpaTransactionManager> jpaSupplier) {
return new PTransform<PCollection<VersionedEntity>, PDone>() {
return new PTransform<PCollection<VersionedEntity>, PCollection<Void>>() {
@Override
public PDone expand(PCollection<VersionedEntity> input) {
input
public PCollection<Void> expand(PCollection<VersionedEntity> input) {
return input
.apply(
"Shard data for " + transformId,
MapElements.into(kvs(integers(), TypeDescriptor.of(VersionedEntity.class)))
.via(ve -> KV.of(ThreadLocalRandom.current().nextInt(maxWriters), ve)))
.apply("Batch output by shard " + transformId, GroupIntoBatches.ofSize(batchSize))
.apply("Write in batch for " + transformId, ParDo.of(new SqlBatchWriter(jpaSupplier)));
return PDone.in(input.getPipeline());
}
};
}
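The transform above assigns each entity a random shard key in `[0, maxWriters)` and then groups each shard into batches of `batchSize` before writing. The same two steps in plain Java (a sketch of the strategy only, not the Beam implementation):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

/** Plain-Java sketch of the shard-then-batch strategy used by writeToSql. */
public class ShardBatch {

  /** Randomly shards the input across maxWriters buckets, then splits each bucket into batches. */
  static <T> List<List<T>> shardAndBatch(
      List<T> input, int maxWriters, int batchSize, Random random) {
    Map<Integer, List<T>> shards = new HashMap<>();
    for (T element : input) {
      // Random sharding spreads load evenly without any keying requirement on the data.
      shards.computeIfAbsent(random.nextInt(maxWriters), k -> new ArrayList<>()).add(element);
    }
    List<List<T>> batches = new ArrayList<>();
    for (List<T> shard : shards.values()) {
      for (int i = 0; i < shard.size(); i += batchSize) {
        batches.add(shard.subList(i, Math.min(i + batchSize, shard.size())));
      }
    }
    return batches;
  }
}
```

Capping the number of shards bounds concurrent writers, which in turn bounds the demand on the JDBC connection pool; batching amortizes transaction overhead across multiple entities.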
@@ -397,8 +403,10 @@ public final class Transforms {
public void setup() {
sleeper = new SystemSleeper();
ObjectifyService.initOfy();
ofy = ObjectifyService.ofy();
try (AppEngineEnvironment env = new AppEngineEnvironment()) {
ObjectifyService.initOfy();
ofy = ObjectifyService.ofy();
}
synchronized (SqlBatchWriter.class) {
if (instanceCount == 0) {
@@ -444,7 +452,10 @@ public final class Transforms {
runnable.run();
return;
} catch (Throwable throwable) {
throwIfNotCausedBy(throwable, OptimisticLockException.class);
if (!isFailedTxnRetriable(throwable)) {
throwIfUnchecked(throwable);
throw new RuntimeException(throwable);
}
int sleepMillis = (1 << attempt) * initialDelayMillis;
int jitter =
ThreadLocalRandom.current().nextInt((int) (sleepMillis * jitterRatio))
@@ -453,21 +464,28 @@ public final class Transforms {
}
}
}
}
/**
* Rethrows {@code throwable} if it is not (and does not have a cause of) {@code causeType};
* otherwise returns with no side effects.
*/
private void throwIfNotCausedBy(Throwable throwable, Class<? extends Throwable> causeType) {
Throwable t = throwable;
while (t != null) {
if (causeType.isInstance(t)) {
return;
}
t = t.getCause();
}
Throwables.throwIfUnchecked(t);
throw new RuntimeException(t);
/**
* Removes BillingEvents, {@link google.registry.model.poll.PollMessage PollMessages} and {@link
* google.registry.model.host.HostResource} from a {@link DomainBase}. These are circular foreign
* key constraints that prevent migration of {@code DomainBase} to SQL databases.
*
* <p>See {@link InitSqlPipeline} for more information.
*/
static class RemoveDomainBaseForeignKeys extends DoFn<VersionedEntity, VersionedEntity> {
@ProcessElement
public void processElement(
@Element VersionedEntity domainBase, OutputReceiver<VersionedEntity> out) {
checkArgument(
domainBase.getEntity().isPresent(), "Unexpected delete entity %s", domainBase.key());
Entity outputEntity =
DomainBaseUtil.removeBillingAndPollAndHosts(domainBase.getEntity().get());
out.output(
VersionedEntity.from(
domainBase.commitTimeMills(),
EntityTranslator.convertToPb(outputEntity).toByteArray()));
}
}
}
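The retry loop shown above sleeps for an exponentially growing interval with random jitter between attempts. A self-contained sketch of that computation (the constants here are illustrative; the pipeline's actual values may differ):

```java
import java.util.concurrent.ThreadLocalRandom;

/** Sketch of capped exponential backoff with jitter, as used between SQL write retries. */
public class Backoff {
  static final int INITIAL_DELAY_MILLIS = 100;
  static final double JITTER_RATIO = 0.2;

  /**
   * Returns the sleep time for a 0-based attempt: base * 2^attempt, plus a random jitter drawn
   * from roughly [-base * ratio / 2, base * ratio / 2).
   */
  static int sleepMillisForAttempt(int attempt) {
    int sleepMillis = (1 << attempt) * INITIAL_DELAY_MILLIS;
    int jitter =
        ThreadLocalRandom.current().nextInt((int) (sleepMillis * JITTER_RATIO))
            - (int) (sleepMillis * JITTER_RATIO / 2);
    return sleepMillis + jitter;
  }
}
```

The jitter desynchronizes concurrent writers that failed on the same contended row, so they do not retry in lockstep and collide again.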


@@ -1527,6 +1527,21 @@ public final class RegistryConfig {
return CONFIG_SETTINGS.get().hibernate.hikariIdleTimeout;
}
/**
* Returns whether to replicate Cloud SQL transactions to Datastore.
*
* <p>If true, all Cloud SQL transactions will be persisted as TransactionEntity objects in the
* Transaction table and replayed against Datastore in a cron job.
*/
public static boolean getCloudSqlReplicateTransactions() {
return CONFIG_SETTINGS.get().cloudSql.replicateTransactions;
}
@VisibleForTesting
public static void overrideCloudSqlReplicateTransactions(boolean replicateTransactions) {
CONFIG_SETTINGS.get().cloudSql.replicateTransactions = replicateTransactions;
}
/** Returns the roid suffix to be used for the roids of all contacts and hosts. */
public static String getContactAndHostRoidSuffix() {
return CONFIG_SETTINGS.get().registryPolicy.contactAndHostRoidSuffix;


@@ -122,6 +122,7 @@ public class RegistryConfigSettings {
public String jdbcUrl;
public String username;
public String instanceConnectionName;
public boolean replicateTransactions;
}
/** Configuration for Apache Beam (Cloud Dataflow). */


@@ -230,6 +230,9 @@ cloudSql:
username: username
# This name is used by Cloud SQL when connecting to the database.
instanceConnectionName: project-id:region:instance-id
# Set this to true to replicate cloud SQL transactions to datastore in the
# background.
replicateTransactions: false
cloudDns:
# Set both properties to null in Production.


@@ -1,38 +0,0 @@
#!/bin/bash
# Copyright 2017 The Nomulus Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Generate javadoc for the project
if (( $# != 3 )); then
echo "Usage: $0 JAVADOC ZIP OUT" 1>&2
exit 1
fi
JAVADOC_BINARY="$1"
ZIP_BINARY="$2"
TARGETFILE="$3"
TMPDIR="$(mktemp -d "${TMPDIR:-/tmp}/generate_javadoc.XXXXXXXX")"
PWDDIR="$(pwd)"
"${JAVADOC_BINARY}" -d "${TMPDIR}" \
$(find java -name \*.java) \
-tag error:t:'EPP Errors' \
-subpackages google.registry \
-exclude google.registry.dns:google.registry.proxy:google.registry.monitoring.blackbox
cd "${TMPDIR}"
"${PWDDIR}/${ZIP_BINARY}" -rXoq "${PWDDIR}/${TARGETFILE}" .
cd -
rm -rf "${TMPDIR}"


@@ -347,8 +347,8 @@ public class DomainCreateFlow implements TransactionalFlow {
.setRepoId(repoId)
.setIdnTableName(validateDomainNameWithIdnTables(domainName))
.setRegistrationExpirationTime(registrationExpirationTime)
.setAutorenewBillingEvent(Key.create(autorenewBillingEvent))
.setAutorenewPollMessage(Key.create(autorenewPollMessage))
.setAutorenewBillingEvent(autorenewBillingEvent.createVKey())
.setAutorenewPollMessage(autorenewPollMessage.createVKey())
.setLaunchNotice(hasClaimsNotice ? launchCreate.get().getNotice() : null)
.setSmdId(signedMarkId)
.setDsData(secDnsCreate.isPresent() ? secDnsCreate.get().getDsData() : null)


@@ -31,7 +31,6 @@ import static google.registry.model.ResourceTransferUtils.handlePendingTransferO
import static google.registry.model.ResourceTransferUtils.updateForeignKeyIndexDeletionTime;
import static google.registry.model.eppoutput.Result.Code.SUCCESS;
import static google.registry.model.eppoutput.Result.Code.SUCCESS_WITH_ACTION_PENDING;
import static google.registry.model.ofy.ObjectifyService.ofy;
import static google.registry.model.reporting.DomainTransactionRecord.TransactionReportField.ADD_FIELDS;
import static google.registry.model.reporting.DomainTransactionRecord.TransactionReportField.RENEW_FIELDS;
import static google.registry.persistence.transaction.TransactionManagerFactory.tm;
@@ -209,7 +208,7 @@ public final class DomainDeleteFlow implements TransactionalFlow {
PollMessage.OneTime deletePollMessage =
createDeletePollMessage(existingDomain, historyEntry, deletionTime);
entitiesToSave.add(deletePollMessage);
builder.setDeletePollMessage(Key.create(deletePollMessage));
builder.setDeletePollMessage(deletePollMessage.createVKey());
}
// Cancel any grace periods that were still active, and set the expiration time accordingly.
@@ -222,8 +221,7 @@ public final class DomainDeleteFlow implements TransactionalFlow {
if (gracePeriod.getOneTimeBillingEvent() != null) {
// Take the amount of registration time being refunded off the expiration time.
// This can be either add grace periods or renew grace periods.
BillingEvent.OneTime oneTime =
ofy().load().key(gracePeriod.getOneTimeBillingEvent()).now();
BillingEvent.OneTime oneTime = tm().load(gracePeriod.getOneTimeBillingEvent());
newExpirationTime = newExpirationTime.minusYears(oneTime.getPeriodYears());
} else if (gracePeriod.getRecurringBillingEvent() != null) {
// Take 1 year off the registration if in the autorenew grace period (no need to load the
@@ -370,12 +368,12 @@ public final class DomainDeleteFlow implements TransactionalFlow {
private Money getGracePeriodCost(GracePeriod gracePeriod, DateTime now) {
if (gracePeriod.getType() == GracePeriodStatus.AUTO_RENEW) {
DateTime autoRenewTime =
ofy().load().key(checkNotNull(gracePeriod.getRecurringBillingEvent())).now()
tm().load(checkNotNull(gracePeriod.getRecurringBillingEvent()))
.getRecurrenceTimeOfYear()
.getLastInstanceBeforeOrAt(now);
.getLastInstanceBeforeOrAt(now);
return getDomainRenewCost(targetId, autoRenewTime, 1);
}
return ofy().load().key(checkNotNull(gracePeriod.getOneTimeBillingEvent())).now().getCost();
return tm().load(checkNotNull(gracePeriod.getOneTimeBillingEvent())).getCost();
}
@Nullable

View File

@@ -517,14 +517,14 @@ public class DomainFlowUtils {
*/
public static void updateAutorenewRecurrenceEndTime(DomainBase domain, DateTime newEndTime) {
Optional<PollMessage.Autorenew> autorenewPollMessage =
Optional.ofNullable(ofy().load().key(domain.getAutorenewPollMessage()).now());
tm().maybeLoad(domain.getAutorenewPollMessage());
// Construct an updated autorenew poll message. If the autorenew poll message no longer exists,
// create a new one at the same id. This can happen if a transfer was requested on a domain
// where all autorenew poll messages had already been delivered (this would cause the poll
// message to be deleted), and then subsequently the transfer was canceled, rejected, or deleted
// (which would cause the poll message to be recreated here).
Key<PollMessage.Autorenew> existingAutorenewKey = domain.getAutorenewPollMessage();
Key<PollMessage.Autorenew> existingAutorenewKey = domain.getAutorenewPollMessage().getOfyKey();
PollMessage.Autorenew updatedAutorenewPollMessage =
autorenewPollMessage.isPresent()
? autorenewPollMessage.get().asBuilder().setAutorenewEndTime(newEndTime).build()
@@ -542,7 +542,7 @@ public class DomainFlowUtils {
ofy().save().entity(updatedAutorenewPollMessage);
}
Recurring recurring = ofy().load().key(domain.getAutorenewBillingEvent()).now();
Recurring recurring = tm().load(domain.getAutorenewBillingEvent());
ofy().save().entity(recurring.asBuilder().setRecurrenceEndTime(newEndTime).build());
}

View File

@@ -33,7 +33,6 @@ import static google.registry.util.DateTimeUtils.leapSafeAddYears;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.flows.EppException;
import google.registry.flows.EppException.ParameterValueRangeErrorException;
import google.registry.flows.ExtensionManager;
@@ -181,8 +180,8 @@ public final class DomainRenewFlow implements TransactionalFlow {
.setLastEppUpdateTime(now)
.setLastEppUpdateClientId(clientId)
.setRegistrationExpirationTime(newExpirationTime)
.setAutorenewBillingEvent(Key.create(newAutorenewEvent))
.setAutorenewPollMessage(Key.create(newAutorenewPollMessage))
.setAutorenewBillingEvent(newAutorenewEvent.createVKey())
.setAutorenewPollMessage(newAutorenewPollMessage.createVKey())
.addGracePeriod(
GracePeriod.forBillingEvent(GracePeriodStatus.RENEW, explicitRenewEvent))
.build();

View File

@@ -26,7 +26,6 @@ import static google.registry.flows.domain.DomainFlowUtils.verifyNotReserved;
import static google.registry.flows.domain.DomainFlowUtils.verifyPremiumNameIsNotBlocked;
import static google.registry.flows.domain.DomainFlowUtils.verifyRegistrarIsActive;
import static google.registry.model.ResourceTransferUtils.updateForeignKeyIndexDeletionTime;
import static google.registry.model.ofy.ObjectifyService.ofy;
import static google.registry.persistence.transaction.TransactionManagerFactory.tm;
import static google.registry.util.DateTimeUtils.END_OF_TIME;
@@ -174,8 +173,8 @@ public final class DomainRestoreRequestFlow implements TransactionalFlow {
existingDomain, newExpirationTime, autorenewEvent, autorenewPollMessage, now, clientId);
updateForeignKeyIndexDeletionTime(newDomain);
entitiesToSave.add(newDomain, historyEntry, autorenewEvent, autorenewPollMessage);
ofy().save().entities(entitiesToSave.build());
ofy().delete().key(existingDomain.getDeletePollMessage());
tm().saveNewOrUpdateAll(entitiesToSave.build());
tm().delete(existingDomain.getDeletePollMessage());
dnsQueue.addDomainRefreshTask(existingDomain.getDomainName());
return responseBuilder
.setExtensions(createResponseExtensions(feesAndCredits, feeUpdate, isExpired))
@@ -232,8 +231,8 @@ public final class DomainRestoreRequestFlow implements TransactionalFlow {
.setStatusValues(null)
.setGracePeriods(null)
.setDeletePollMessage(null)
.setAutorenewBillingEvent(Key.create(autorenewEvent))
.setAutorenewPollMessage(Key.create(autorenewPollMessage))
.setAutorenewBillingEvent(autorenewEvent.createVKey())
.setAutorenewPollMessage(autorenewPollMessage.createVKey())
.setLastEppUpdateTime(now)
.setLastEppUpdateClientId(clientId)
.build();

View File

@@ -186,8 +186,8 @@ public final class DomainTransferApproveFlow implements TransactionalFlow {
.setTransferredRegistrationExpirationTime(newExpirationTime)
.build())
.setRegistrationExpirationTime(newExpirationTime)
.setAutorenewBillingEvent(Key.create(autorenewEvent))
.setAutorenewPollMessage(Key.create(gainingClientAutorenewPollMessage))
.setAutorenewBillingEvent(autorenewEvent.createVKey())
.setAutorenewPollMessage(gainingClientAutorenewPollMessage.createVKey())
// Remove all the old grace periods and add a new one for the transfer.
.setGracePeriods(
billingEvent.isPresent()

View File

@@ -630,9 +630,9 @@ public abstract class BillingEvent extends ImmutableObject
.setParent(historyEntry);
// Set the grace period's billing event using the appropriate Cancellation builder method.
if (gracePeriod.getOneTimeBillingEvent() != null) {
builder.setOneTimeEventKey(VKey.from(gracePeriod.getOneTimeBillingEvent()));
builder.setOneTimeEventKey(gracePeriod.getOneTimeBillingEvent());
} else if (gracePeriod.getRecurringBillingEvent() != null) {
builder.setRecurringEventKey(VKey.from(gracePeriod.getRecurringBillingEvent()));
builder.setRecurringEventKey(gracePeriod.getRecurringBillingEvent());
}
return builder.build();
}

View File

@@ -227,7 +227,8 @@ public class DomainBase extends EppResource
* refer to a {@link PollMessage} timed to when the domain is fully deleted. If the domain is
* restored, the message should be deleted.
*/
@Transient Key<PollMessage.OneTime> deletePollMessage;
@Column(name = "deletion_poll_message_id")
VKey<PollMessage.OneTime> deletePollMessage;
/**
* The recurring billing event associated with this domain's autorenewals.
@@ -237,7 +238,8 @@ public class DomainBase extends EppResource
* {@link #registrationExpirationTime} is changed the recurrence should be closed, a new one
* should be created, and this field should be updated to point to the new one.
*/
@Transient Key<BillingEvent.Recurring> autorenewBillingEvent;
@Column(name = "billing_recurrence_id")
VKey<BillingEvent.Recurring> autorenewBillingEvent;
/**
* The recurring poll message associated with this domain's autorenewals.
@@ -247,7 +249,8 @@ public class DomainBase extends EppResource
* {@link #registrationExpirationTime} is changed the recurrence should be closed, a new one
* should be created, and this field should be updated to point to the new one.
*/
@Transient Key<PollMessage.Autorenew> autorenewPollMessage;
@Column(name = "autorenew_poll_message_id")
VKey<PollMessage.Autorenew> autorenewPollMessage;
/** The unexpired grace periods for this domain (some of which may not be active yet). */
@Transient @ElementCollection Set<GracePeriod> gracePeriods;
@@ -316,15 +319,15 @@ public class DomainBase extends EppResource
return registrationExpirationTime;
}
public Key<PollMessage.OneTime> getDeletePollMessage() {
public VKey<PollMessage.OneTime> getDeletePollMessage() {
return deletePollMessage;
}
public Key<BillingEvent.Recurring> getAutorenewBillingEvent() {
public VKey<BillingEvent.Recurring> getAutorenewBillingEvent() {
return autorenewBillingEvent;
}
public Key<PollMessage.Autorenew> getAutorenewPollMessage() {
public VKey<PollMessage.Autorenew> getAutorenewPollMessage() {
return autorenewPollMessage;
}
@@ -453,14 +456,8 @@ public class DomainBase extends EppResource
.setRegistrationExpirationTime(expirationDate)
// Set the speculatively-written new autorenew events as the domain's autorenew
// events.
.setAutorenewBillingEvent(
transferData.getServerApproveAutorenewEvent() == null
? null
: transferData.getServerApproveAutorenewEvent().getOfyKey())
.setAutorenewPollMessage(
transferData.getServerApproveAutorenewPollMessage() == null
? null
: transferData.getServerApproveAutorenewPollMessage().getOfyKey());
.setAutorenewBillingEvent(transferData.getServerApproveAutorenewEvent())
.setAutorenewPollMessage(transferData.getServerApproveAutorenewPollMessage());
if (transferData.getTransferPeriod().getValue() == 1) {
// Set the grace period using a key to the prescheduled transfer billing event. Not using
// GracePeriod.forBillingEvent() here in order to avoid the actual Datastore fetch.
@@ -471,9 +468,7 @@ public class DomainBase extends EppResource
transferExpirationTime.plus(
Registry.get(getTld()).getTransferGracePeriodLength()),
transferData.getGainingClientId(),
transferData.getServerApproveBillingEvent() == null
? null
: transferData.getServerApproveBillingEvent().getOfyKey())));
transferData.getServerApproveBillingEvent())));
} else {
// There won't be a billing event, so we don't need a grace period
builder.setGracePeriods(ImmutableSet.of());
@@ -801,19 +796,17 @@ public class DomainBase extends EppResource
return this;
}
public Builder setDeletePollMessage(Key<PollMessage.OneTime> deletePollMessage) {
public Builder setDeletePollMessage(VKey<PollMessage.OneTime> deletePollMessage) {
getInstance().deletePollMessage = deletePollMessage;
return this;
}
public Builder setAutorenewBillingEvent(
Key<BillingEvent.Recurring> autorenewBillingEvent) {
public Builder setAutorenewBillingEvent(VKey<BillingEvent.Recurring> autorenewBillingEvent) {
getInstance().autorenewBillingEvent = autorenewBillingEvent;
return this;
}
public Builder setAutorenewPollMessage(
Key<PollMessage.Autorenew> autorenewPollMessage) {
public Builder setAutorenewPollMessage(VKey<PollMessage.Autorenew> autorenewPollMessage) {
getInstance().autorenewPollMessage = autorenewPollMessage;
return this;
}

View File

@@ -17,12 +17,13 @@ package google.registry.model.domain;
import static com.google.common.base.Preconditions.checkArgument;
import static google.registry.util.PreconditionsUtils.checkArgumentNotNull;
import com.googlecode.objectify.Key;
import com.googlecode.objectify.annotation.Embed;
import com.googlecode.objectify.annotation.Ignore;
import google.registry.model.ImmutableObject;
import google.registry.model.billing.BillingEvent;
import google.registry.model.billing.BillingEvent.Recurring;
import google.registry.model.domain.rgp.GracePeriodStatus;
import google.registry.persistence.VKey;
import javax.annotation.Nullable;
import javax.persistence.Column;
import javax.persistence.GeneratedValue;
@@ -57,18 +58,18 @@ public class GracePeriod extends ImmutableObject {
/**
* The one-time billing event corresponding to the action that triggered this grace period, or
* null if not applicable. Not set for autorenew grace periods (which instead use the field
* {@code billingEventRecurring}) or for redemption grace periods (since deletes have no cost).
* null if not applicable. Not set for autorenew grace periods (which instead use the field {@code
* billingEventRecurring}) or for redemption grace periods (since deletes have no cost).
*/
// NB: Would @IgnoreSave(IfNull.class), but not allowed for @Embed collections.
Key<BillingEvent.OneTime> billingEventOneTime = null;
VKey<BillingEvent.OneTime> billingEventOneTime = null;
/**
* The recurring billing event corresponding to the action that triggered this grace period, if
* applicable - i.e. if the action was an autorenew - or null in all other cases.
*/
// NB: Would @IgnoreSave(IfNull.class), but not allowed for @Embed collections.
Key<BillingEvent.Recurring> billingEventRecurring = null;
VKey<BillingEvent.Recurring> billingEventRecurring = null;
public GracePeriodStatus getType() {
return type;
@@ -91,8 +92,7 @@ public class GracePeriod extends ImmutableObject {
* Returns the one time billing event. The value will only be non-null if the type of this grace
* period is not AUTO_RENEW.
*/
public Key<BillingEvent.OneTime> getOneTimeBillingEvent() {
public VKey<BillingEvent.OneTime> getOneTimeBillingEvent() {
return billingEventOneTime;
}
@@ -100,16 +100,16 @@ public class GracePeriod extends ImmutableObject {
* Returns the recurring billing event. The value will only be non-null if the type of this grace
* period is AUTO_RENEW.
*/
public Key<BillingEvent.Recurring> getRecurringBillingEvent() {
public VKey<BillingEvent.Recurring> getRecurringBillingEvent() {
return billingEventRecurring;
}
private static GracePeriod createInternal(
GracePeriodStatus type,
DateTime expirationTime,
String clientId,
@Nullable Key<BillingEvent.OneTime> billingEventOneTime,
@Nullable Key<BillingEvent.Recurring> billingEventRecurring) {
GracePeriodStatus type,
DateTime expirationTime,
String clientId,
@Nullable VKey<BillingEvent.OneTime> billingEventOneTime,
@Nullable VKey<BillingEvent.Recurring> billingEventRecurring) {
checkArgument((billingEventOneTime == null) || (billingEventRecurring == null),
"A grace period can have at most one billing event");
checkArgument(
@@ -127,15 +127,15 @@ public class GracePeriod extends ImmutableObject {
/**
* Creates a GracePeriod for an (optional) OneTime billing event.
*
* <p>Normal callers should always use {@link #forBillingEvent} instead, assuming they do not
* need to avoid loading the BillingEvent from Datastore. This method should typically be
* called only from test code to explicitly construct GracePeriods.
* <p>Normal callers should always use {@link #forBillingEvent} instead, assuming they do not need
* to avoid loading the BillingEvent from Datastore. This method should typically be called only
* from test code to explicitly construct GracePeriods.
*/
public static GracePeriod create(
GracePeriodStatus type,
DateTime expirationTime,
String clientId,
@Nullable Key<BillingEvent.OneTime> billingEventOneTime) {
@Nullable VKey<BillingEvent.OneTime> billingEventOneTime) {
return createInternal(type, expirationTime, clientId, billingEventOneTime, null);
}
@@ -144,7 +144,7 @@ public class GracePeriod extends ImmutableObject {
GracePeriodStatus type,
DateTime expirationTime,
String clientId,
Key<BillingEvent.Recurring> billingEventRecurring) {
VKey<Recurring> billingEventRecurring) {
checkArgumentNotNull(billingEventRecurring, "billingEventRecurring cannot be null");
return createInternal(type, expirationTime, clientId, null, billingEventRecurring);
}
@@ -159,6 +159,6 @@ public class GracePeriod extends ImmutableObject {
public static GracePeriod forBillingEvent(
GracePeriodStatus type, BillingEvent.OneTime billingEvent) {
return create(
type, billingEvent.getBillingTime(), billingEvent.getClientId(), Key.create(billingEvent));
type, billingEvent.getBillingTime(), billingEvent.getClientId(), billingEvent.createVKey());
}
}

View File

@@ -48,6 +48,10 @@ public final class RdeRevision extends ImmutableObject {
*/
int revision;
public int getRevision() {
return revision;
}
/**
* Returns next revision ID to use when staging a new deposit file for the given triplet.
*

View File

@@ -14,7 +14,9 @@
package google.registry.model.registry.label;
import static com.google.common.base.Charsets.US_ASCII;
import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.hash.Funnels.stringFunnel;
import static com.google.common.hash.Funnels.unencodedCharsFunnel;
import static google.registry.config.RegistryConfig.getDomainLabelListCacheDuration;
import static google.registry.config.RegistryConfig.getSingletonCachePersistDuration;
@@ -32,43 +34,82 @@ import com.google.common.cache.CacheLoader;
import com.google.common.cache.CacheLoader.InvalidCacheLoadException;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.hash.BloomFilter;
import com.google.common.util.concurrent.UncheckedExecutionException;
import com.googlecode.objectify.Key;
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;
import com.googlecode.objectify.annotation.Ignore;
import com.googlecode.objectify.annotation.Parent;
import google.registry.model.Buildable;
import google.registry.model.ImmutableObject;
import google.registry.model.annotations.ReportedOn;
import google.registry.model.registry.Registry;
import google.registry.schema.replay.DatastoreAndSqlEntity;
import google.registry.schema.replay.DatastoreEntity;
import google.registry.schema.replay.SqlEntity;
import google.registry.schema.tld.PremiumListDao;
import google.registry.util.NonFinalForTesting;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.math.BigDecimal;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import java.util.concurrent.ExecutionException;
import javax.annotation.Nullable;
import javax.persistence.CollectionTable;
import javax.persistence.Column;
import javax.persistence.ElementCollection;
import javax.persistence.Index;
import javax.persistence.JoinColumn;
import javax.persistence.MapKeyColumn;
import javax.persistence.PostLoad;
import javax.persistence.PrePersist;
import javax.persistence.Table;
import javax.persistence.Transient;
import org.hibernate.LazyInitializationException;
import org.joda.money.CurrencyUnit;
import org.joda.money.Money;
import org.joda.time.Duration;
/** A premium list entity, persisted to Datastore, that is used to check domain label prices. */
/**
* A premium list entity that is used to check domain label prices.
*
* <p>Note that the primary key of this entity is {@link #revisionId}, which is auto-generated by
* the database. So, if a retry of insertion happens after the previous attempt unexpectedly
* succeeds, we will end up with having two exact same premium lists that differ only by revisionId.
* This is fine though, because we only use the list with the highest revisionId.
*/
@ReportedOn
@Entity
@javax.persistence.Entity
@Table(indexes = {@Index(columnList = "name", name = "premiumlist_name_idx")})
public final class PremiumList extends BaseDomainLabelList<Money, PremiumList.PremiumListEntry>
implements DatastoreEntity {
implements DatastoreAndSqlEntity {
/** Stores the revision key for the set of currently used premium list entry entities. */
Key<PremiumListRevision> revisionKey;
@Transient Key<PremiumListRevision> revisionKey;
@Override
public ImmutableList<SqlEntity> toSqlEntities() {
return ImmutableList.of(); // PremiumList is dual-written
}
@Ignore
@Column(nullable = false)
CurrencyUnit currency;
@Ignore
@ElementCollection
@CollectionTable(
name = "PremiumEntry",
joinColumns = @JoinColumn(name = "revisionId", referencedColumnName = "revisionId"))
@MapKeyColumn(name = "domainLabel")
@Column(name = "price", nullable = false)
Map<String, BigDecimal> labelsToPrices;
@Ignore
@Column(nullable = false)
BloomFilter<String> bloomFilter;
/** Virtual parent entity for premium list entry entities associated with a single revision. */
@ReportedOn
@@ -247,6 +288,35 @@ public final class PremiumList extends BaseDomainLabelList<Money, PremiumList.Pr
return Optional.ofNullable(loadPremiumList(name));
}
/** Returns the {@link CurrencyUnit} used for this list. */
public CurrencyUnit getCurrency() {
return currency;
}
/**
* Returns a {@link Map} of domain labels to prices.
*
* <p>Note that this is lazily loaded and thus will throw a {@link LazyInitializationException} if
* used outside the transaction in which the given entity was loaded. You generally should not be
* using this anyway as it's inefficient to load all of the PremiumEntry rows if you don't need
* them. To check prices, use {@link PremiumListDao#getPremiumPrice} instead.
*/
@Nullable
public ImmutableMap<String, BigDecimal> getLabelsToPrices() {
return labelsToPrices == null ? null : ImmutableMap.copyOf(labelsToPrices);
}
/**
* Returns a Bloom filter to determine whether a label might be premium, or is definitely not.
*
* <p>If the domain label might be premium, then the next step is to check for the existence of a
* corresponding row in the PremiumListEntry table. Otherwise, we know for sure it's not premium,
* and no DB load is required.
*/
public BloomFilter<String> getBloomFilter() {
return bloomFilter;
}
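The two-step lookup described in the javadoc above (a fast probabilistic pre-check, then an authoritative table read only when needed) can be illustrated with a toy stand-in. This is a sketch using only the JDK; the real PremiumList uses Guava's `BloomFilter<String>` with `stringFunnel(US_ASCII)` and the PremiumEntry table, and the class, hash, and sizes below are invented for illustration.

```java
import java.util.BitSet;
import java.util.HashMap;
import java.util.Map;

/** Toy two-step premium check: Bloom-style pre-filter, then an authoritative map lookup. */
public class PremiumCheckSketch {
  private static final int FILTER_BITS = 1 << 16;
  private static final int NUM_HASHES = 3;

  private final BitSet bits = new BitSet(FILTER_BITS);
  private final Map<String, String> prices = new HashMap<>(); // stand-in for the PremiumEntry table

  public void put(String label, String price) {
    prices.put(label, price);
    for (int seed = 0; seed < NUM_HASHES; seed++) {
      bits.set(hash(label, seed));
    }
  }

  /** Returns the premium price, or null. Skips the "table" when the filter says definitely absent. */
  public String getPremiumPrice(String label) {
    for (int seed = 0; seed < NUM_HASHES; seed++) {
      if (!bits.get(hash(label, seed))) {
        return null; // definitely not premium; no DB load required
      }
    }
    return prices.get(label); // might be premium; confirm against the authoritative store
  }

  private static int hash(String s, int seed) {
    // Cheap seeded hash for illustration only; not the funnel Guava uses.
    return Math.floorMod(s.hashCode() * 31 + seed * 0x9E3779B9, FILTER_BITS);
  }

  public static void main(String[] args) {
    PremiumCheckSketch list = new PremiumCheckSketch();
    list.put("rich", "100.00");
    System.out.println(list.getPremiumPrice("rich"));  // 100.00
    System.out.println(list.getPremiumPrice("cheap")); // null
  }
}
```

The design point is the asymmetry: a negative filter answer is certain, so the common not-premium case never touches the database, while the rare positive answer costs one confirming lookup.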
/**
* A premium list entry entity, persisted to Datastore. Each instance represents the price of a
* single label on a given TLD.
@@ -339,9 +409,39 @@ public final class PremiumList extends BaseDomainLabelList<Money, PremiumList.Pr
return this;
}
public Builder setCurrency(CurrencyUnit currency) {
getInstance().currency = currency;
return this;
}
public Builder setLabelsToPrices(Map<String, BigDecimal> labelsToPrices) {
getInstance().labelsToPrices = ImmutableMap.copyOf(labelsToPrices);
return this;
}
@Override
public PremiumList build() {
if (getInstance().labelsToPrices != null) {
// ASCII is used for the charset because all premium list domain labels are stored
// punycoded.
getInstance().bloomFilter =
BloomFilter.create(stringFunnel(US_ASCII), getInstance().labelsToPrices.size());
getInstance()
.labelsToPrices
.keySet()
.forEach(label -> getInstance().bloomFilter.put(label));
}
return super.build();
}
}
@PrePersist
void prePersist() {
lastUpdateTime = creationTime;
}
@PostLoad
void postLoad() {
creationTime = lastUpdateTime;
}
}

View File

@@ -38,10 +38,10 @@ public enum ReservationType {
ALLOWED_IN_SUNRISE("Reserved", 0),
/** The domain can only be registered by providing a specific token. */
RESERVED_FOR_SPECIFIC_USE("Allocation token required", 1),
RESERVED_FOR_SPECIFIC_USE("Reserved; alloc. token required", 1),
/** The domain is for an anchor tenant and can only be registered using a specific token. */
RESERVED_FOR_ANCHOR_TENANT("Allocation token required", 2),
RESERVED_FOR_ANCHOR_TENANT("Reserved; alloc. token required", 2),
/**
* The domain can only be registered during sunrise for defensive purposes, and will never

View File

@@ -0,0 +1,58 @@
// Copyright 2019 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.persistence;
import com.google.common.base.Predicates;
import com.google.common.collect.ImmutableSet;
import java.sql.SQLException;
import java.util.function.Predicate;
import javax.persistence.OptimisticLockException;
/** Helpers for identifying retriable database operations. */
public final class JpaRetries {
private JpaRetries() {}
private static final ImmutableSet<String> RETRIABLE_TXN_SQL_STATE =
ImmutableSet.of(
"40001", // serialization_failure
"40P01", // deadlock_detected, PSQL-specific
"55006", // object_in_use, PSQL and DB2
"55P03" // lock_not_available, PSQL-specific
);
private static final Predicate<Throwable> RETRIABLE_TXN_PREDICATE =
Predicates.or(
OptimisticLockException.class::isInstance,
e ->
e instanceof SQLException
&& RETRIABLE_TXN_SQL_STATE.contains(((SQLException) e).getSQLState()));
public static boolean isFailedTxnRetriable(Throwable throwable) {
Throwable t = throwable;
while (t != null) {
if (RETRIABLE_TXN_PREDICATE.test(t)) {
return true;
}
t = t.getCause();
}
return false;
}
public static boolean isFailedQueryRetriable(Throwable throwable) {
// TODO(weiminyu): check for more error codes.
return isFailedTxnRetriable(throwable);
}
}
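The cause-chain walk in JpaRetries can be exercised in isolation. The sketch below re-creates the same SQLSTATE check with JDK types only (class name is invented; the real helper additionally treats `javax.persistence.OptimisticLockException` as retriable, which is omitted here to stay dependency-free).

```java
import java.sql.SQLException;
import java.util.Set;

/** Minimal re-creation of JpaRetries' retriable-transaction check, using only JDK types. */
public class RetriesSketch {
  // Same SQLSTATE codes as JpaRetries: serialization_failure, deadlock_detected,
  // object_in_use, lock_not_available.
  private static final Set<String> RETRIABLE_SQL_STATES =
      Set.of("40001", "40P01", "55006", "55P03");

  /** Walks the cause chain looking for an SQLException with a retriable SQLSTATE. */
  public static boolean isFailedTxnRetriable(Throwable throwable) {
    for (Throwable t = throwable; t != null; t = t.getCause()) {
      if (t instanceof SQLException
          && RETRIABLE_SQL_STATES.contains(((SQLException) t).getSQLState())) {
        return true;
      }
    }
    return false;
  }

  public static void main(String[] args) {
    // A deadlock (SQLSTATE 40P01) stays retriable even when wrapped by the persistence layer.
    Throwable wrapped = new RuntimeException(new SQLException("deadlock detected", "40P01"));
    System.out.println(isFailedTxnRetriable(wrapped)); // true
    System.out.println(isFailedTxnRetriable(new SQLException("syntax error", "42601"))); // false
  }
}
```

Walking `getCause()` matters because JPA providers rarely surface the raw `SQLException`; it usually arrives buried under one or more wrapper exceptions.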

View File

@@ -26,6 +26,7 @@ import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.ImmutableSet;
import com.google.common.flogger.FluentLogger;
import google.registry.config.RegistryConfig;
import google.registry.persistence.VKey;
import google.registry.util.Clock;
import java.lang.reflect.Field;
@@ -101,9 +102,9 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
EntityTransaction txn = txnInfo.entityManager.getTransaction();
try {
txn.begin();
txnInfo.inTransaction = true;
txnInfo.transactionTime = clock.nowUtc();
txnInfo.start(clock);
T result = work.get();
txnInfo.recordTransaction();
txn.commit();
return result;
} catch (RuntimeException | Error e) {
@@ -177,6 +178,7 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
checkArgumentNotNull(entity, "entity must be specified");
assertInTransaction();
getEntityManager().persist(entity);
transactionInfo.get().addUpdate(entity);
}
@Override
@@ -191,6 +193,7 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
checkArgumentNotNull(entity, "entity must be specified");
assertInTransaction();
getEntityManager().merge(entity);
transactionInfo.get().addUpdate(entity);
}
@Override
@@ -206,6 +209,7 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
assertInTransaction();
checkArgument(checkExists(entity), "Given entity does not exist");
getEntityManager().merge(entity);
transactionInfo.get().addUpdate(entity);
}
@Override
@@ -297,6 +301,7 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
String.format("DELETE FROM %s WHERE %s", entityType.getName(), getAndClause(entityIds));
Query query = getEntityManager().createQuery(sql);
entityIds.forEach(entityId -> query.setParameter(entityId.name, entityId.value));
transactionInfo.get().addDelete(key);
return query.executeUpdate();
}
@@ -387,9 +392,23 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
boolean inTransaction = false;
DateTime transactionTime;
// Serializable representation of the transaction to be persisted in the Transaction table.
Transaction.Builder contentsBuilder;
/** Start a new transaction. */
private void start(Clock clock) {
checkArgumentNotNull(clock);
inTransaction = true;
transactionTime = clock.nowUtc();
if (RegistryConfig.getCloudSqlReplicateTransactions()) {
contentsBuilder = new Transaction.Builder();
}
}
private void clear() {
inTransaction = false;
transactionTime = null;
contentsBuilder = null;
if (entityManager != null) {
// Close this EntityManager so that the connection pool can reuse it; this doesn't
// close the underlying database connection.
@@ -397,5 +416,26 @@ public class JpaTransactionManagerImpl implements JpaTransactionManager {
entityManager = null;
}
}
private void addUpdate(Object entity) {
if (contentsBuilder != null) {
contentsBuilder.addUpdate(entity);
}
}
private void addDelete(VKey<?> key) {
if (contentsBuilder != null) {
contentsBuilder.addDelete(key);
}
}
private void recordTransaction() {
if (contentsBuilder != null) {
Transaction persistedTxn = contentsBuilder.build();
if (!persistedTxn.isEmpty()) {
entityManager.persist(persistedTxn.toEntity());
}
}
}
}
}
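The pattern TransactionInfo uses above (a mutation recorder that is null when replication is disabled, with no-op add methods and a persist step skipped for empty transactions) can be sketched with plain JDK types. The class and method names below are illustrative, not the real Transaction/Builder API, which serializes entities and VKeys rather than strings.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the conditional mutation-recording pattern used by TransactionInfo. */
public class TxnRecorderSketch {
  private List<String> mutations; // null when transaction replication is disabled

  TxnRecorderSketch(boolean replicationEnabled) {
    if (replicationEnabled) {
      mutations = new ArrayList<>(); // analogous to creating a Transaction.Builder in start()
    }
  }

  /** No-op when replication is off, so the hot path pays essentially nothing. */
  void addUpdate(String entity) {
    if (mutations != null) {
      mutations.add("UPDATE " + entity);
    }
  }

  void addDelete(String key) {
    if (mutations != null) {
      mutations.add("DELETE " + key);
    }
  }

  /** Returns the serialized record to persist, or null when there is nothing to record. */
  String recordTransaction() {
    if (mutations == null || mutations.isEmpty()) {
      return null; // empty transactions are never written to the Transaction table
    }
    return String.join(";", mutations);
  }

  public static void main(String[] args) {
    TxnRecorderSketch on = new TxnRecorderSketch(true);
    on.addUpdate("DomainBase/1");
    System.out.println(on.recordTransaction()); // UPDATE DomainBase/1
    System.out.println(new TxnRecorderSketch(false).recordTransaction()); // null
  }
}
```

Guarding every call site on the builder being non-null, rather than checking the config flag repeatedly, keeps the flag read in one place (`start`) and makes the disabled path trivially cheap.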

View File

@@ -109,11 +109,20 @@ public class Transaction extends ImmutableObject implements Buildable {
return builder.build();
}
/** Returns true if the transaction contains no mutations. */
public boolean isEmpty() {
return mutations.isEmpty();
}
@Override
public Builder asBuilder() {
return new Builder(clone(this));
}
public final TransactionEntity toEntity() {
return new TransactionEntity(serialize());
}
public static class Builder extends GenericBuilder<Transaction, Builder> {
ImmutableList.Builder listBuilder = new ImmutableList.Builder();

View File

@@ -0,0 +1,43 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.persistence.transaction;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
/**
* Object to be stored in the transaction table.
*
* <p>This consists of a sequential identifier and a serialized {@code Transaction} object.
*/
@Entity
@Table(name = "Transaction")
public class TransactionEntity {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
long id;
byte[] contents;
TransactionEntity() {}
TransactionEntity(byte[] contents) {
this.contents = contents;
}
}

View File

@@ -29,7 +29,7 @@ public enum RdeResourceType {
DOMAIN("urn:ietf:params:xml:ns:rdeDomain-1.0", EnumSet.of(FULL, THIN)),
HOST("urn:ietf:params:xml:ns:rdeHost-1.0", EnumSet.of(FULL)),
REGISTRAR("urn:ietf:params:xml:ns:rdeRegistrar-1.0", EnumSet.of(FULL, THIN)),
IDN("urn:ietf:params:xml:ns:rdeIDN-1.0", EnumSet.of(FULL, THIN)),
IDN("urn:ietf:params:xml:ns:rdeIDN-1.0", EnumSet.of(FULL)),
HEADER("urn:ietf:params:xml:ns:rdeHeader-1.0", EnumSet.of(FULL, THIN));
private final String uri;

View File

@@ -77,7 +77,7 @@ public final class RdeStagingReducer extends Reducer<PendingDeposit, DepositFrag
private final byte[] stagingKeyBytes;
private final RdeMarshaller marshaller;
private RdeStagingReducer(
RdeStagingReducer(
TaskQueueUtils taskQueueUtils,
LockHandler lockHandler,
int gcsBufferSize,
@@ -125,7 +125,7 @@ public final class RdeStagingReducer extends Reducer<PendingDeposit, DepositFrag
final DateTime watermark = key.watermark();
final int revision =
Optional.ofNullable(key.revision())
.orElse(RdeRevision.getNextRevision(tld, watermark, mode));
.orElseGet(() -> RdeRevision.getNextRevision(tld, watermark, mode));
String id = RdeUtil.timestampToId(watermark);
String prefix = RdeNamingUtils.makeRydeFilename(tld, watermark, mode, 1, revision);
if (key.manual()) {
@@ -168,9 +168,13 @@ public final class RdeStagingReducer extends Reducer<PendingDeposit, DepositFrag
logger.atSevere().log("Fragment error: %s", fragment.error());
}
}
for (IdnTableEnum idn : IdnTableEnum.values()) {
output.write(marshaller.marshalIdn(idn.getTable()));
counter.increment(RdeResourceType.IDN);
// Don't write the IDN elements for BRDA.
if (mode == RdeMode.FULL) {
for (IdnTableEnum idn : IdnTableEnum.values()) {
output.write(marshaller.marshalIdn(idn.getTable()));
counter.increment(RdeResourceType.IDN);
}
}
// Output XML that says how many resources were emitted.

View File
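The `orElse` → `orElseGet` change above is a standard fix: `Optional.orElse` always evaluates its argument, so `RdeRevision.getNextRevision` would run even when an explicit revision was supplied, while `orElseGet` defers the call until the `Optional` is actually empty. A minimal stand-alone illustration of the difference:

```java
import java.util.Optional;
import java.util.concurrent.atomic.AtomicInteger;

public class LazyDefaultDemo {
  static final AtomicInteger computeCalls = new AtomicInteger();

  // Stands in for an expensive default such as getNextRevision.
  static int computeDefault() {
    computeCalls.incrementAndGet();
    return -1;
  }

  static int[] demo() {
    Optional<Integer> explicitRevision = Optional.of(5);
    // orElse evaluates computeDefault() even though a value is present.
    explicitRevision.orElse(computeDefault());
    int afterOrElse = computeCalls.get();
    // orElseGet only invokes the supplier when the Optional is empty.
    explicitRevision.orElseGet(LazyDefaultDemo::computeDefault);
    int afterOrElseGet = computeCalls.get();
    return new int[] {afterOrElse, afterOrElseGet};
  }
}
```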

@@ -16,6 +16,7 @@ package google.registry.schema.tld;
import com.google.common.collect.ImmutableList;
import google.registry.model.ImmutableObject;
import google.registry.model.registry.label.PremiumList;
import google.registry.schema.replay.DatastoreEntity;
import google.registry.schema.replay.SqlEntity;
import java.io.Serializable;

View File

@@ -1,151 +0,0 @@
// Copyright 2019 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.schema.tld;
import static com.google.common.base.Charsets.US_ASCII;
import static com.google.common.base.Preconditions.checkState;
import static com.google.common.hash.Funnels.stringFunnel;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import com.google.common.hash.BloomFilter;
import google.registry.model.CreateAutoTimestamp;
import google.registry.schema.replay.DatastoreEntity;
import google.registry.schema.replay.SqlEntity;
import java.math.BigDecimal;
import java.util.Map;
import javax.annotation.Nullable;
import javax.persistence.CollectionTable;
import javax.persistence.Column;
import javax.persistence.ElementCollection;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Index;
import javax.persistence.JoinColumn;
import javax.persistence.MapKeyColumn;
import javax.persistence.Table;
import org.hibernate.LazyInitializationException;
import org.joda.money.CurrencyUnit;
import org.joda.time.DateTime;
/**
* A list of premium prices for domain names.
*
* <p>Note that the primary key of this entity is {@link #revisionId}, which is auto-generated by
* the database. So, if a retry of insertion happens after the previous attempt unexpectedly
 * succeeds, we will end up with two identical premium lists that differ only by revisionId.
* This is fine though, because we only use the list with the highest revisionId.
*/
@Entity
@Table(indexes = {@Index(columnList = "name", name = "premiumlist_name_idx")})
public class PremiumList implements SqlEntity {
@Column(nullable = false)
private String name;
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(nullable = false)
private Long revisionId;
@Column(nullable = false)
private CreateAutoTimestamp creationTimestamp = CreateAutoTimestamp.create(null);
@Column(nullable = false)
private CurrencyUnit currency;
@ElementCollection
@CollectionTable(
name = "PremiumEntry",
joinColumns = @JoinColumn(name = "revisionId", referencedColumnName = "revisionId"))
@MapKeyColumn(name = "domainLabel")
@Column(name = "price", nullable = false)
private Map<String, BigDecimal> labelsToPrices;
@Column(nullable = false)
private BloomFilter<String> bloomFilter;
private PremiumList(String name, CurrencyUnit currency, Map<String, BigDecimal> labelsToPrices) {
this.name = name;
this.currency = currency;
this.labelsToPrices = labelsToPrices;
// ASCII is used for the charset because all premium list domain labels are stored punycoded.
this.bloomFilter = BloomFilter.create(stringFunnel(US_ASCII), labelsToPrices.size());
labelsToPrices.keySet().forEach(this.bloomFilter::put);
}
// Hibernate requires this default constructor.
private PremiumList() {}
/** Constructs a {@link PremiumList} object. */
public static PremiumList create(
String name, CurrencyUnit currency, Map<String, BigDecimal> labelsToPrices) {
return new PremiumList(name, currency, labelsToPrices);
}
/** Returns the name of the premium list, which is usually also a TLD string. */
public String getName() {
return name;
}
/** Returns the {@link CurrencyUnit} used for this list. */
public CurrencyUnit getCurrency() {
return currency;
}
/** Returns the ID of this revision, or throws if null. */
public Long getRevisionId() {
checkState(
revisionId != null,
"revisionId is null because this object has not yet been persisted to the DB");
return revisionId;
}
/** Returns the creation time of this revision of the premium list. */
public DateTime getCreationTimestamp() {
return creationTimestamp.getTimestamp();
}
/**
* Returns a {@link Map} of domain labels to prices.
*
* <p>Note that this is lazily loaded and thus will throw a {@link LazyInitializationException} if
* used outside the transaction in which the given entity was loaded. You generally should not be
* using this anyway as it's inefficient to load all of the PremiumEntry rows if you don't need
* them. To check prices, use {@link PremiumListDao#getPremiumPrice} instead.
*/
@Nullable
public ImmutableMap<String, BigDecimal> getLabelsToPrices() {
return labelsToPrices == null ? null : ImmutableMap.copyOf(labelsToPrices);
}
/**
* Returns a Bloom filter to determine whether a label might be premium, or is definitely not.
*
* <p>If the domain label might be premium, then the next step is to check for the existence of a
* corresponding row in the PremiumListEntry table. Otherwise, we know for sure it's not premium,
* and no DB load is required.
*/
public BloomFilter<String> getBloomFilter() {
return bloomFilter;
}
@Override
public ImmutableList<DatastoreEntity> toDatastoreEntities() {
return ImmutableList.of(); // PremiumList is dual-written
}
}

View File
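The deleted `PremiumList` above keeps a Guava `BloomFilter` so that most "is this label premium?" checks can answer "definitely not" without a database read. As a hedged, from-scratch sketch of the underlying idea only — Guava's real implementation uses proper hash functions and sizing math — a bloom filter can be built on a `BitSet`:

```java
import java.util.BitSet;

public class TinyBloomFilter {
  private final BitSet bits;
  private final int size;
  private final int hashes;

  TinyBloomFilter(int size, int hashes) {
    this.bits = new BitSet(size);
    this.size = size;
    this.hashes = hashes;
  }

  // Derives the i-th bit position from the label's hash code.
  private int position(String label, int i) {
    int h = label.hashCode() * 31 + i * 0x9E3779B9;
    return Math.floorMod(h, size);
  }

  void put(String label) {
    for (int i = 0; i < hashes; i++) {
      bits.set(position(label, i));
    }
  }

  // False means definitely absent; true means "might be present",
  // which is when the caller would fall through to the DB lookup.
  boolean mightContain(String label) {
    for (int i = 0; i < hashes; i++) {
      if (!bits.get(position(label, i))) {
        return false;
      }
    }
    return true;
  }
}
```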

@@ -25,6 +25,7 @@ import com.google.common.annotations.VisibleForTesting;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import google.registry.model.registry.label.PremiumList;
import google.registry.util.NonFinalForTesting;
import java.math.BigDecimal;
import java.util.Optional;

View File

@@ -20,6 +20,7 @@ import static google.registry.persistence.transaction.TransactionManagerFactory.
import com.google.common.cache.CacheLoader.InvalidCacheLoadException;
import com.google.common.util.concurrent.UncheckedExecutionException;
import google.registry.model.registry.Registry;
import google.registry.model.registry.label.PremiumList;
import google.registry.schema.tld.PremiumListCache.RevisionIdAndLabel;
import java.math.BigDecimal;
import java.util.Optional;

View File

@@ -16,6 +16,7 @@ package google.registry.schema.tld;
import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.collect.ImmutableSet.toImmutableSet;
import static org.joda.time.DateTimeZone.UTC;
import com.google.common.base.Splitter;
import com.google.common.collect.ImmutableMap;
@@ -23,11 +24,13 @@ import com.google.common.collect.ImmutableSet;
import com.google.common.collect.ImmutableSortedSet;
import com.google.common.collect.Iterables;
import com.google.common.collect.Maps;
import google.registry.model.registry.label.PremiumList;
import google.registry.model.registry.label.PremiumList.PremiumListEntry;
import java.math.BigDecimal;
import java.util.List;
import java.util.Map;
import org.joda.money.CurrencyUnit;
import org.joda.time.DateTime;
/** Static utility methods for {@link PremiumList}. */
public class PremiumListUtils {
@@ -37,10 +40,7 @@ public class PremiumListUtils {
Splitter.on('\n').omitEmptyStrings().splitToList(inputData);
ImmutableMap<String, PremiumListEntry> prices =
new google.registry.model.registry.label.PremiumList.Builder()
.setName(name)
.build()
.parse(inputDataPreProcessed);
new PremiumList.Builder().setName(name).build().parse(inputDataPreProcessed);
ImmutableSet<CurrencyUnit> currencies =
prices.values().stream()
.map(e -> e.getValue().getCurrencyUnit())
@@ -54,7 +54,12 @@ public class PremiumListUtils {
Map<String, BigDecimal> priceAmounts =
Maps.transformValues(prices, ple -> ple.getValue().getAmount());
return PremiumList.create(name, currency, priceAmounts);
return new PremiumList.Builder()
.setName(name)
.setCurrency(currency)
.setLabelsToPrices(priceAmounts)
.setCreationTime(DateTime.now(UTC))
.build();
}
private PremiumListUtils() {}

View File
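`parseToPremiumList` above flattens `PremiumListEntry` values to their `BigDecimal` amounts via Guava's `Maps.transformValues`. The equivalent transformation with only the JDK, assuming (for illustration) that an entry is a simple amount/currency pair rather than the real Nomulus type:

```java
import java.math.BigDecimal;
import java.util.Map;
import java.util.stream.Collectors;

public class PremiumPrices {
  // Hypothetical stand-in for PremiumListEntry: a label's price.
  record Entry(BigDecimal amount, String currency) {}

  // JDK-only equivalent of Maps.transformValues(prices, ple -> ple.getValue().getAmount()).
  static Map<String, BigDecimal> toAmounts(Map<String, Entry> prices) {
    return prices.entrySet().stream()
        .collect(Collectors.toMap(Map.Entry::getKey, e -> e.getValue().amount()));
  }
}
```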

@@ -16,17 +16,51 @@ package google.registry.tools;
import com.beust.jcommander.Parameters;
import google.registry.beam.spec11.Spec11Pipeline;
import google.registry.config.CredentialModule.LocalCredential;
import google.registry.config.RegistryConfig.Config;
import google.registry.util.GoogleCredentialsBundle;
import google.registry.util.Retrier;
import javax.annotation.Nullable;
import javax.inject.Inject;
/** Nomulus command that deploys the {@link Spec11Pipeline} template. */
@Parameters(commandDescription = "Deploy the Spec11 pipeline to GCS.")
public class DeploySpec11PipelineCommand implements Command {
@Inject Spec11Pipeline spec11Pipeline;
@Inject
@Config("projectId")
String projectId;
@Inject
@Config("beamStagingUrl")
String beamStagingUrl;
@Inject
@Config("spec11TemplateUrl")
String spec11TemplateUrl;
@Inject
@Config("reportingBucketUrl")
String reportingBucketUrl;
@Inject @LocalCredential GoogleCredentialsBundle googleCredentialsBundle;
@Inject Retrier retrier;
@Inject
@Nullable
@Config("sqlAccessInfoFile")
String sqlAccessInfoFile;
@Override
public void run() {
spec11Pipeline.deploy();
Spec11Pipeline pipeline =
new Spec11Pipeline(
projectId,
beamStagingUrl,
spec11TemplateUrl,
reportingBucketUrl,
googleCredentialsBundle,
retrier);
pipeline.deploy();
}
}

View File

@@ -117,7 +117,7 @@ public final class DomainLockUtils {
RegistryLock newLock =
RegistryLockDao.save(lock.asBuilder().setLockCompletionTimestamp(now).build());
setAsRelock(newLock);
tm().transact(() -> applyLockStatuses(newLock, now));
tm().transact(() -> applyLockStatuses(newLock, now, isAdmin));
return newLock;
});
}
@@ -171,7 +171,7 @@ public final class DomainLockUtils {
createLockBuilder(domainName, registrarId, registrarPocId, isAdmin)
.setLockCompletionTimestamp(now)
.build());
tm().transact(() -> applyLockStatuses(newLock, now));
tm().transact(() -> applyLockStatuses(newLock, now, isAdmin));
setAsRelock(newLock);
return newLock;
});
@@ -222,18 +222,18 @@ public final class DomainLockUtils {
String domainName, String registrarId, @Nullable String registrarPocId, boolean isAdmin) {
DateTime now = jpaTm().getTransactionTime();
DomainBase domainBase = getDomain(domainName, registrarId, now);
verifyDomainNotLocked(domainBase);
verifyDomainNotLocked(domainBase, isAdmin);
// Multiple pending actions are not allowed
// Multiple pending actions are not allowed for non-admins
RegistryLockDao.getMostRecentByRepoId(domainBase.getRepoId())
.ifPresent(
previousLock ->
checkArgument(
previousLock.isLockRequestExpired(now)
|| previousLock.getUnlockCompletionTimestamp().isPresent(),
|| previousLock.getUnlockCompletionTimestamp().isPresent()
|| isAdmin,
"A pending or completed lock action already exists for %s",
previousLock.getDomainName()));
return new RegistryLock.Builder()
.setVerificationCode(stringGenerator.createString(VERIFICATION_CODE_LENGTH))
.setDomainName(domainName)
@@ -250,6 +250,8 @@ public final class DomainLockUtils {
Optional<RegistryLock> lockOptional =
RegistryLockDao.getMostRecentVerifiedLockByRepoId(domainBase.getRepoId());
verifyDomainLocked(domainBase, isAdmin);
RegistryLock.Builder newLockBuilder;
if (isAdmin) {
// Admins should always be able to unlock domains in case we get in a bad state
@@ -265,7 +267,6 @@ public final class DomainLockUtils {
.setLockCompletionTimestamp(now)
.setRegistrarId(registrarId));
} else {
verifyDomainLocked(domainBase);
RegistryLock lock =
lockOptional.orElseThrow(
() ->
@@ -293,16 +294,17 @@ public final class DomainLockUtils {
.setRegistrarId(registrarId);
}
private static void verifyDomainNotLocked(DomainBase domainBase) {
private static void verifyDomainNotLocked(DomainBase domainBase, boolean isAdmin) {
checkArgument(
!domainBase.getStatusValues().containsAll(REGISTRY_LOCK_STATUSES),
isAdmin || !domainBase.getStatusValues().containsAll(REGISTRY_LOCK_STATUSES),
"Domain %s is already locked",
domainBase.getDomainName());
}
private static void verifyDomainLocked(DomainBase domainBase) {
private static void verifyDomainLocked(DomainBase domainBase, boolean isAdmin) {
checkArgument(
!Sets.intersection(domainBase.getStatusValues(), REGISTRY_LOCK_STATUSES).isEmpty(),
isAdmin
|| !Sets.intersection(domainBase.getStatusValues(), REGISTRY_LOCK_STATUSES).isEmpty(),
"Domain %s is already unlocked",
domainBase.getDomainName());
}
@@ -310,8 +312,7 @@ public final class DomainLockUtils {
private DomainBase getDomain(String domainName, String registrarId, DateTime now) {
DomainBase domain =
loadByForeignKeyCached(DomainBase.class, domainName, now)
.orElseThrow(
() -> new IllegalArgumentException(String.format("Unknown domain %s", domainName)));
.orElseThrow(() -> new IllegalArgumentException("Domain doesn't exist"));
// The user must have specified either the correct registrar ID or the admin registrar ID
checkArgument(
registryAdminRegistrarId.equals(registrarId)
@@ -330,9 +331,9 @@ public final class DomainLockUtils {
String.format("Invalid verification code %s", verificationCode)));
}
private void applyLockStatuses(RegistryLock lock, DateTime lockTime) {
private void applyLockStatuses(RegistryLock lock, DateTime lockTime, boolean isAdmin) {
DomainBase domain = getDomain(lock.getDomainName(), lock.getRegistrarId(), lockTime);
verifyDomainNotLocked(domain);
verifyDomainNotLocked(domain, isAdmin);
DomainBase newDomain =
domain
@@ -345,9 +346,7 @@ public final class DomainLockUtils {
private void removeLockStatuses(RegistryLock lock, boolean isAdmin, DateTime unlockTime) {
DomainBase domain = getDomain(lock.getDomainName(), lock.getRegistrarId(), unlockTime);
if (!isAdmin) {
verifyDomainLocked(domain);
}
verifyDomainLocked(domain, isAdmin);
DomainBase newDomain =
domain

View File
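The `DomainLockUtils` diff above threads an `isAdmin` flag into each precondition so that admins bypass the "already locked/unlocked" checks (the rationale in the diff: admins should always be able to unlock domains in case of a bad state). The pattern, reduced to a stdlib-only sketch where `checkArgument` is a local helper rather than Guava's:

```java
import java.util.Set;

public class LockChecks {
  // Local stand-in for Guava's Preconditions.checkArgument.
  static void checkArgument(boolean condition, String message) {
    if (!condition) {
      throw new IllegalArgumentException(message);
    }
  }

  // Admins may re-lock an already-locked domain; regular users may not.
  static void verifyNotLocked(Set<String> statuses, Set<String> lockStatuses, boolean isAdmin) {
    checkArgument(isAdmin || !statuses.containsAll(lockStatuses), "Domain is already locked");
  }
}
```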

@@ -14,15 +14,7 @@
package google.registry.tools;
import static google.registry.model.EppResourceUtils.loadByForeignKey;
import com.beust.jcommander.Parameters;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.google.common.flogger.FluentLogger;
import google.registry.model.domain.DomainBase;
import google.registry.model.eppcommon.StatusValue;
import org.joda.time.DateTime;
/**
* A command to registry lock domain names.
@@ -32,25 +24,6 @@ import org.joda.time.DateTime;
@Parameters(separators = " =", commandDescription = "Registry lock a domain via EPP.")
public class LockDomainCommand extends LockOrUnlockDomainCommand {
private static final FluentLogger logger = FluentLogger.forEnclosingClass();
@Override
protected boolean shouldApplyToDomain(String domain, DateTime now) {
DomainBase domainBase =
loadByForeignKey(DomainBase.class, domain, now)
.orElseThrow(
() ->
new IllegalArgumentException(
String.format("Domain '%s' does not exist or is deleted", domain)));
ImmutableSet<StatusValue> statusesToAdd =
Sets.difference(REGISTRY_LOCK_STATUSES, domainBase.getStatusValues()).immutableCopy();
if (statusesToAdd.isEmpty()) {
logger.atInfo().log("Domain '%s' is already locked and needs no updates.", domain);
return false;
}
return true;
}
@Override
protected void createAndApplyRequest(String domain) {
domainLockUtils.administrativelyApplyLock(domain, clientId, null, true);

View File

@@ -15,22 +15,24 @@
package google.registry.tools;
import static com.google.common.base.Preconditions.checkArgument;
import static com.google.common.collect.ImmutableSet.toImmutableSet;
import static com.google.common.collect.Iterables.partition;
import static google.registry.model.eppcommon.StatusValue.SERVER_DELETE_PROHIBITED;
import static google.registry.model.eppcommon.StatusValue.SERVER_TRANSFER_PROHIBITED;
import static google.registry.model.eppcommon.StatusValue.SERVER_UPDATE_PROHIBITED;
import static google.registry.persistence.transaction.TransactionManagerFactory.jpaTm;
import static google.registry.persistence.transaction.TransactionManagerFactory.tm;
import static google.registry.util.CollectionUtils.findDuplicates;
import com.beust.jcommander.Parameter;
import com.google.common.base.Joiner;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.ImmutableSet;
import com.google.common.flogger.FluentLogger;
import google.registry.config.RegistryConfig.Config;
import google.registry.model.eppcommon.StatusValue;
import java.util.List;
import javax.inject.Inject;
import org.joda.time.DateTime;
/** Shared base class for commands to registry lock or unlock a domain via EPP. */
public abstract class LockOrUnlockDomainCommand extends ConfirmingCommand
@@ -78,37 +80,37 @@ public abstract class LockOrUnlockDomainCommand extends ConfirmingCommand
@Override
protected String execute() {
ImmutableSet.Builder<String> successfulDomainsBuilder = new ImmutableSet.Builder<>();
ImmutableSet.Builder<String> skippedDomainsBuilder = new ImmutableSet.Builder<>();
ImmutableSet.Builder<String> failedDomainsBuilder = new ImmutableSet.Builder<>();
ImmutableMap.Builder<String, String> failedDomainsToReasons = new ImmutableMap.Builder<>();
partition(getDomains(), BATCH_SIZE)
.forEach(
batch ->
tm().transact(
() -> {
for (String domain : batch) {
if (shouldApplyToDomain(domain, tm().getTransactionTime())) {
try {
createAndApplyRequest(domain);
} catch (Throwable t) {
logger.atSevere().withCause(t).log(
"Error when (un)locking domain %s.", domain);
failedDomainsBuilder.add(domain);
}
successfulDomainsBuilder.add(domain);
} else {
skippedDomainsBuilder.add(domain);
}
}
}));
// we require that the jpaTm is the outer transaction in DomainLockUtils
jpaTm()
.transact(
() ->
tm().transact(
() -> {
for (String domain : batch) {
try {
createAndApplyRequest(domain);
} catch (Throwable t) {
logger.atSevere().withCause(t).log(
"Error when (un)locking domain %s.", domain);
failedDomainsToReasons.put(domain, t.getMessage());
continue;
}
successfulDomainsBuilder.add(domain);
}
})));
ImmutableSet<String> successfulDomains = successfulDomainsBuilder.build();
ImmutableSet<String> skippedDomains = skippedDomainsBuilder.build();
ImmutableSet<String> failedDomains = failedDomainsBuilder.build();
ImmutableSet<String> failedDomains =
failedDomainsToReasons.build().entrySet().stream()
.map(entry -> String.format("%s (%s)", entry.getKey(), entry.getValue()))
.collect(toImmutableSet());
return String.format(
"Successfully locked/unlocked domains:\n%s\nSkipped domains:\n%s\nFailed domains:\n%s",
successfulDomains, skippedDomains, failedDomains);
"Successfully locked/unlocked domains:\n%s\nFailed domains:\n%s",
successfulDomains, failedDomains);
}
protected abstract boolean shouldApplyToDomain(String domain, DateTime now);
protected abstract void createAndApplyRequest(String domain);
}

View File
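The command above now collects failures into a map of domain → reason and renders each one as `domain (reason)` in the final report. A JDK-only sketch of that rendering step, using an insertion-ordered set in place of Guava's `ImmutableSet`:

```java
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class FailureReport {
  // Renders each failed domain as "domain (reason)", mirroring the
  // failedDomainsToReasons formatting in the command above.
  static Set<String> render(Map<String, String> failedDomainsToReasons) {
    return failedDomainsToReasons.entrySet().stream()
        .map(e -> String.format("%s (%s)", e.getKey(), e.getValue()))
        .collect(Collectors.toCollection(LinkedHashSet::new));
  }
}
```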

@@ -29,7 +29,7 @@ import com.google.appengine.tools.remoteapi.RemoteApiOptions;
import com.google.common.base.Throwables;
import com.google.common.collect.ImmutableMap;
import com.google.common.collect.Iterables;
import google.registry.beam.initsql.BeamJpaModule;
import google.registry.backup.AppEngineEnvironment;
import google.registry.config.RegistryConfig;
import google.registry.model.ofy.ObjectifyService;
import google.registry.persistence.transaction.TransactionManagerFactory;
@@ -66,6 +66,13 @@ final class RegistryCli implements AutoCloseable, CommandRunner {
+ "If not set, credentials saved by running `nomulus login' will be used.")
private String credentialJson = null;
@Parameter(
names = {"--sql_access_info"},
description =
"Name of a file containing space-separated SQL access info used when deploying "
+ "Beam pipelines")
private String sqlAccessInfoFile = null;
// Do not make this final - compile-time constant inlining may interfere with JCommander.
@ParametersDelegate
private LoggingParameters loggingParams = new LoggingParameters();
@@ -161,7 +168,7 @@ final class RegistryCli implements AutoCloseable, CommandRunner {
component =
DaggerRegistryToolComponent.builder()
.credentialFilePath(credentialJson)
.beamJpaModule(new BeamJpaModule(credentialJson))
.sqlAccessInfoFile(sqlAccessInfoFile)
.build();
// JCommander stores sub-commands as nested JCommander objects containing a list of user objects
@@ -172,7 +179,7 @@ final class RegistryCli implements AutoCloseable, CommandRunner {
Iterables.getOnlyElement(jcommander.getCommands().get(parsedCommand).getObjects());
loggingParams.configureLogging(); // Must be called after parameters are parsed.
try {
try (AppEngineEnvironment env = new AppEngineEnvironment()) {
runCommand(command);
} catch (RuntimeException ex) {
if (Throwables.getRootCause(ex) instanceof LoginRequiredException) {

View File

@@ -134,6 +134,9 @@ interface RegistryToolComponent {
@BindsInstance
Builder credentialFilePath(@Nullable @Config("credentialFilePath") String credentialFilePath);
@BindsInstance
Builder sqlAccessInfoFile(@Nullable @Config("sqlAccessInfoFile") String sqlAccessInfoFile);
Builder beamJpaModule(BeamJpaModule beamJpaModule);
RegistryToolComponent build();

View File

@@ -14,16 +14,8 @@
package google.registry.tools;
import static google.registry.model.EppResourceUtils.loadByForeignKey;
import com.beust.jcommander.Parameters;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.google.common.flogger.FluentLogger;
import google.registry.model.domain.DomainBase;
import google.registry.model.eppcommon.StatusValue;
import java.util.Optional;
import org.joda.time.DateTime;
/**
* A command to registry unlock domain names.
@@ -33,25 +25,6 @@ import org.joda.time.DateTime;
@Parameters(separators = " =", commandDescription = "Registry unlock a domain via EPP.")
public class UnlockDomainCommand extends LockOrUnlockDomainCommand {
private static final FluentLogger logger = FluentLogger.forEnclosingClass();
@Override
protected boolean shouldApplyToDomain(String domain, DateTime now) {
DomainBase domainBase =
loadByForeignKey(DomainBase.class, domain, now)
.orElseThrow(
() ->
new IllegalArgumentException(
String.format("Domain '%s' does not exist or is deleted", domain)));
ImmutableSet<StatusValue> statusesToRemove =
Sets.intersection(domainBase.getStatusValues(), REGISTRY_LOCK_STATUSES).immutableCopy();
if (statusesToRemove.isEmpty()) {
logger.atInfo().log("Domain '%s' is already unlocked and needs no updates.", domain);
return false;
}
return true;
}
@Override
protected void createAndApplyRequest(String domain) {
domainLockUtils.administrativelyApplyUnlock(domain, clientId, true, Optional.empty());

View File

@@ -31,7 +31,6 @@ import com.google.common.collect.ImmutableMap;
import com.google.common.collect.ImmutableMultimap;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import com.googlecode.objectify.Key;
import google.registry.model.billing.BillingEvent;
import google.registry.model.domain.DomainBase;
import google.registry.model.domain.Period;
@@ -224,8 +223,8 @@ class UnrenewDomainCommand extends ConfirmingCommand implements CommandWithRemot
.setRegistrationExpirationTime(newExpirationTime)
.setLastEppUpdateTime(now)
.setLastEppUpdateClientId(domain.getCurrentSponsorClientId())
.setAutorenewBillingEvent(Key.create(newAutorenewEvent))
.setAutorenewPollMessage(Key.create(newAutorenewPollMessage))
.setAutorenewBillingEvent(newAutorenewEvent.createVKey())
.setAutorenewPollMessage(newAutorenewPollMessage.createVKey())
.build();
// In order to do so, it'll need to write out a new HistoryEntry (likely of type SYNTHETIC), a new
// autorenew billing event and poll message, and a new one time poll message at the present time

View File

@@ -82,7 +82,7 @@ public class CreatePremiumListAction extends CreateOrUpdatePremiumListAction {
logger.atInfo().log("Saving premium list to Cloud SQL for TLD %s", name);
// TODO(mcilwain): Call logInputData() here once Datastore persistence is removed.
google.registry.schema.tld.PremiumList premiumList = parseToPremiumList(name, inputData);
PremiumList premiumList = parseToPremiumList(name, inputData);
PremiumListDao.saveNew(premiumList);
String message =

View File

@@ -74,7 +74,7 @@ public class UpdatePremiumListAction extends CreateOrUpdatePremiumListAction {
protected void saveToCloudSql() {
logger.atInfo().log("Updating premium list '%s' in Cloud SQL.", name);
// TODO(mcilwain): Add logInputData() call here once DB migration is complete.
google.registry.schema.tld.PremiumList premiumList = parseToPremiumList(name, inputData);
PremiumList premiumList = parseToPremiumList(name, inputData);
PremiumListDao.update(premiumList);
String message =
String.format(

View File

@@ -30,6 +30,7 @@ import com.google.common.collect.ImmutableList;
import com.google.common.flogger.FluentLogger;
import com.google.gson.Gson;
import google.registry.config.RegistryConfig.Config;
import google.registry.flows.domain.DomainFlowUtils;
import google.registry.model.registrar.Registrar;
import google.registry.model.registrar.RegistrarContact;
import google.registry.request.Action;
@@ -118,6 +119,7 @@ public class RegistryLockPostAction implements Runnable, JsonActionRunner.JsonAc
String registrarId = postInput.registrarId;
checkArgument(!Strings.isNullOrEmpty(registrarId), "Missing key for registrarId");
checkArgument(!Strings.isNullOrEmpty(postInput.domainName), "Missing key for domainName");
DomainFlowUtils.validateDomainName(postInput.domainName);
checkNotNull(postInput.isLock, "Missing key for isLock");
UserAuthInfo userAuthInfo =
authResult

View File

@@ -92,6 +92,7 @@ registry.registrar.RegistryLock.prototype.fillLocksPage_ = function(e) {
lockEnabledForContact: locksDetails.lockEnabledForContact});
if (locksDetails.lockEnabledForContact) {
this.registryLockEmailAddress = locksDetails.email;
// Listen to the lock-domain 'submit' button click
var lockButton = goog.dom.getRequiredElement('button-lock-domain');
goog.events.listen(lockButton, goog.events.EventType.CLICK, this.onLockDomain_, false, this);
@@ -116,7 +117,10 @@ registry.registrar.RegistryLock.prototype.showModal_ = function(targetElement, d
// attach the modal to the parent element so focus remains correct if the user closes the modal
var modalElement = goog.soy.renderAsElement(
registry.soy.registrar.registrylock.confirmModal,
{domain: domain, isLock: isLock, isAdmin: this.isAdmin});
{domain: domain,
isLock: isLock,
isAdmin: this.isAdmin,
emailAddress: this.registryLockEmailAddress});
parentElement.prepend(modalElement);
if (domain == null) {
goog.dom.getRequiredElement('domain-lock-input-value').focus();

View File

@@ -29,12 +29,13 @@
<class>google.registry.model.host.HostResource</class>
<class>google.registry.model.registrar.Registrar</class>
<class>google.registry.model.registrar.RegistrarContact</class>
<class>google.registry.model.registry.label.PremiumList</class>
<class>google.registry.model.reporting.Spec11ThreatMatch</class>
<class>google.registry.persistence.transaction.TransactionEntity</class>
<class>google.registry.schema.domain.RegistryLock</class>
<class>google.registry.schema.tmch.ClaimsList</class>
<class>google.registry.schema.cursor.Cursor</class>
<class>google.registry.schema.server.Lock</class>
<class>google.registry.schema.tld.PremiumList</class>
<class>google.registry.schema.tld.PremiumEntry</class>
<class>google.registry.model.domain.secdns.DelegationSignerData</class>
<class>google.registry.model.domain.GracePeriod</class>

View File

@@ -107,7 +107,7 @@
{@param readonly: bool}
{@param? registryLockAllowedForRegistrar: bool}
<form name="item" class="{css('item')} {css('registrar')}">
<h1>Contact Details</h1>
<h1>Contact details</h1>
{call .contactInfo data="all"}
{param namePrefix: $namePrefix /}
{param item: $item /}
@@ -142,11 +142,21 @@
{param name: 'name' /}
{/call}
{call registry.soy.forms.inputFieldRow data="all"}
{param label: 'Email' /}
{param label: 'Primary account email' /}
{param namePrefix: $namePrefix /}
{param name: 'emailAddress' /}
{param disabled: not $readonly and $item['emailAddress'] != null /}
{/call}
{if isNonnull($item['registryLockEmailAddress'])}
{call registry.soy.forms.inputFieldRow data="all"}
{param label: 'Registry lock email address' /}
{param namePrefix: $namePrefix /}
{param name: 'registryLockEmailAddress' /}
{param disabled: not $readonly /}
{param description: 'Address to which registry (un)lock confirmation emails will be ' +
'sent. This is not necessarily the account email address that is used for login.' /}
{/call}
{/if}
{call registry.soy.forms.inputFieldRow data="all"}
{param label: 'Phone' /}
{param namePrefix: $namePrefix /}
@@ -176,10 +186,6 @@
{if isNonnull($item['gaeUserId'])}
<input type="hidden" name="{$namePrefix}gaeUserId" value="{$item['gaeUserId']}">
{/if}
{if isNonnull($item['registryLockEmailAddress'])}
<input type="hidden" name="{$namePrefix}registryLockEmailAddress"
value="{$item['registryLockEmailAddress']}">
{/if}
</div>
{/template}
@@ -282,19 +288,19 @@
<hr>
</tr>
{call .whoisVisibleRadios_}
{param description: 'Show in Registrar WHOIS record as Admin contact' /}
{param description: 'Show in Registrar WHOIS record as admin contact' /}
{param fieldName: $namePrefix + 'visibleInWhoisAsAdmin' /}
{param visible: $item['visibleInWhoisAsAdmin'] == true /}
{/call}
{call .whoisVisibleRadios_}
{param description: 'Show in Registrar WHOIS record as Technical contact' /}
{param description: 'Show in Registrar WHOIS record as technical contact' /}
{param fieldName: $namePrefix + 'visibleInWhoisAsTech' /}
{param visible: $item['visibleInWhoisAsTech'] == true /}
{/call}
{call .whoisVisibleRadios_}
{param description:
'Show Phone and Email in Domain WHOIS Record as Registrar Abuse Contact' +
' (Per CL&D Requirements)' /}
'Show Phone and Email in Domain WHOIS Record as registrar abuse contact' +
' (per CL&D requirements)' /}
{param note:
'*Can only apply to one contact. Selecting Yes for this contact will' +
' force this setting for all other contacts to be No.' /}

View File

@@ -115,12 +115,12 @@
{template .confirmModal}
{@param isLock: bool}
{@param isAdmin: bool}
{@param emailAddress: string}
{@param? domain: string|null}
<div id="lock-confirm-modal" class="{css('lock-confirm-modal')}">
<div class="modal-content">
<p>Are you sure you want to {if $isLock}lock a domain{else}unlock the domain {$domain}{/if}?
We will send an email to the email address on file to confirm the {if not $isLock}un{/if}
lock.</p>
We will send an email to {$emailAddress} to confirm the {if not $isLock}un{/if}lock.</p>
<label for="domain-to-lock">Domain: </label>
<input id="domain-lock-input-value"
{if isNonnull($domain)}

View File

@@ -36,6 +36,7 @@ import org.testcontainers.containers.PostgreSQLContainer;
*/
@Parameters(separators = " =", commandDescription = "Generate PostgreSQL schema.")
public class GenerateSqlSchemaCommand implements Command {
private static final String DB_NAME = "postgres";
private static final String DB_USERNAME = "postgres";
private static final String DB_PASSWORD = "domain-registry";

View File

@@ -27,35 +27,32 @@ import static org.mockito.Mockito.when;
 import com.google.common.collect.ImmutableMap;
 import google.registry.model.ofy.CommitLogCheckpoint;
 import google.registry.model.ofy.CommitLogCheckpointRoot;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import google.registry.testing.FakeClock;
 import google.registry.testing.TaskQueueHelper.TaskMatcher;
 import google.registry.util.Retrier;
 import google.registry.util.TaskQueueUtils;
 import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link CommitLogCheckpointAction}. */
-@RunWith(JUnit4.class)
 public class CommitLogCheckpointActionTest {

   private static final String QUEUE_NAME = "export-commits";

-  @Rule
-  public final AppEngineRule appEngine =
-      AppEngineRule.builder().withDatastoreAndCloudSql().withTaskQueue().build();
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder().withDatastoreAndCloudSql().withTaskQueue().build();

-  CommitLogCheckpointStrategy strategy = mock(CommitLogCheckpointStrategy.class);
+  private CommitLogCheckpointStrategy strategy = mock(CommitLogCheckpointStrategy.class);

-  DateTime now = DateTime.now(UTC);
-  CommitLogCheckpointAction task = new CommitLogCheckpointAction();
+  private DateTime now = DateTime.now(UTC);
+  private CommitLogCheckpointAction task = new CommitLogCheckpointAction();

-  @Before
-  public void before() {
+  @BeforeEach
+  void beforeEach() {
     task.clock = new FakeClock(now);
     task.strategy = strategy;
     task.taskQueueUtils = new TaskQueueUtils(new Retrier(null, 1));
@@ -66,7 +63,7 @@ public class CommitLogCheckpointActionTest {
   }

   @Test
-  public void testRun_noCheckpointEverWritten_writesCheckpointAndEnqueuesTask() {
+  void testRun_noCheckpointEverWritten_writesCheckpointAndEnqueuesTask() {
     task.run();
     assertTasksEnqueued(
         QUEUE_NAME,
@@ -78,7 +75,7 @@ public class CommitLogCheckpointActionTest {
   }

   @Test
-  public void testRun_checkpointWrittenBeforeNow_writesCheckpointAndEnqueuesTask() {
+  void testRun_checkpointWrittenBeforeNow_writesCheckpointAndEnqueuesTask() {
     DateTime oneMinuteAgo = now.minusMinutes(1);
     persistResource(CommitLogCheckpointRoot.create(oneMinuteAgo));
     task.run();
@@ -92,7 +89,7 @@ public class CommitLogCheckpointActionTest {
   }

   @Test
-  public void testRun_checkpointWrittenAfterNow_doesntOverwrite_orEnqueueTask() {
+  void testRun_checkpointWrittenAfterNow_doesntOverwrite_orEnqueueTask() {
     DateTime oneMinuteFromNow = now.plusMinutes(1);
     persistResource(CommitLogCheckpointRoot.create(oneMinuteFromNow));
     task.run();


@@ -31,32 +31,28 @@ import google.registry.model.ofy.Ofy;
 import google.registry.model.registry.Registry;
 import google.registry.persistence.transaction.TransactionManager;
 import google.registry.schema.cursor.CursorDao;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import google.registry.testing.FakeClock;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link CommitLogCheckpointStrategy}. */
-@RunWith(JUnit4.class)
 public class CommitLogCheckpointStrategyTest {

-  @Rule
-  public final AppEngineRule appEngine = AppEngineRule.builder().withDatastoreAndCloudSql().build();
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder().withDatastoreAndCloudSql().build();

-  @Rule
-  public final InjectRule inject = new InjectRule();
+  @RegisterExtension public final InjectExtension inject = new InjectExtension();

-  final FakeClock clock = new FakeClock(DateTime.parse("2000-01-01TZ"));
-  final Ofy ofy = new Ofy(clock);
-  final TransactionManager tm = new DatastoreTransactionManager(ofy);
-  final CommitLogCheckpointStrategy strategy = new CommitLogCheckpointStrategy();
+  private final FakeClock clock = new FakeClock(DateTime.parse("2000-01-01TZ"));
+  private final Ofy ofy = new Ofy(clock);
+  private final TransactionManager tm = new DatastoreTransactionManager(ofy);
+  private final CommitLogCheckpointStrategy strategy = new CommitLogCheckpointStrategy();

   /**
    * Supplier to inject into CommitLogBucket for doling out predictable bucket IDs.
@@ -64,7 +60,7 @@ public class CommitLogCheckpointStrategyTest {
    * <p>If not overridden, the supplier returns 1 so that other saves won't hit an NPE (since even
    * if they use saveWithoutBackup() the transaction still selects a bucket key early).
    */
-  final FakeSupplier<Integer> fakeBucketIdSupplier = new FakeSupplier<>(1);
+  private final FakeSupplier<Integer> fakeBucketIdSupplier = new FakeSupplier<>(1);

   /** Gross but necessary supplier that can be modified to return the desired value. */
   private static class FakeSupplier<T> implements Supplier<T> {
@@ -74,7 +70,7 @@ public class CommitLogCheckpointStrategyTest {
     /** Set this value field to make the supplier return this value. */
     T value = null;

-    public FakeSupplier(T defaultValue) {
+    FakeSupplier(T defaultValue) {
       this.defaultValue = defaultValue;
     }
@@ -84,8 +80,8 @@ public class CommitLogCheckpointStrategyTest {
     }
   }

-  @Before
-  public void before() {
+  @BeforeEach
+  void beforeEach() {
     strategy.clock = clock;
     strategy.ofy = ofy;
@@ -102,13 +98,13 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readBucketTimestamps_noCommitLogs() {
+  void test_readBucketTimestamps_noCommitLogs() {
     assertThat(strategy.readBucketTimestamps())
         .containsExactly(1, START_OF_TIME, 2, START_OF_TIME, 3, START_OF_TIME);
   }

   @Test
-  public void test_readBucketTimestamps_withSomeCommitLogs() {
+  void test_readBucketTimestamps_withSomeCommitLogs() {
     DateTime startTime = clock.nowUtc();
     writeCommitLogToBucket(1);
     clock.advanceOneMilli();
@@ -118,7 +114,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readBucketTimestamps_againAfterUpdate_reflectsUpdate() {
+  void test_readBucketTimestamps_againAfterUpdate_reflectsUpdate() {
     DateTime firstTime = clock.nowUtc();
     writeCommitLogToBucket(1);
     writeCommitLogToBucket(2);
@@ -133,14 +129,14 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readNewCommitLogsAndFindThreshold_noCommitsAtAll_returnsEndOfTime() {
+  void test_readNewCommitLogsAndFindThreshold_noCommitsAtAll_returnsEndOfTime() {
     ImmutableMap<Integer, DateTime> bucketTimes =
         ImmutableMap.of(1, START_OF_TIME, 2, START_OF_TIME, 3, START_OF_TIME);
     assertThat(strategy.readNewCommitLogsAndFindThreshold(bucketTimes)).isEqualTo(END_OF_TIME);
   }

   @Test
-  public void test_readNewCommitLogsAndFindThreshold_noNewCommits_returnsEndOfTime() {
+  void test_readNewCommitLogsAndFindThreshold_noNewCommits_returnsEndOfTime() {
     DateTime now = clock.nowUtc();
     writeCommitLogToBucket(1);
     clock.advanceOneMilli();
@@ -153,7 +149,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readNewCommitLogsAndFindThreshold_tiedNewCommits_returnsCommitTimeMinusOne() {
+  void test_readNewCommitLogsAndFindThreshold_tiedNewCommits_returnsCommitTimeMinusOne() {
     DateTime now = clock.nowUtc();
     writeCommitLogToBucket(1);
     writeCommitLogToBucket(2);
@@ -164,7 +160,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readNewCommitLogsAndFindThreshold_someNewCommits_returnsEarliestTimeMinusOne() {
+  void test_readNewCommitLogsAndFindThreshold_someNewCommits_returnsEarliestTimeMinusOne() {
     DateTime now = clock.nowUtc();
     writeCommitLogToBucket(1); // 1A
     writeCommitLogToBucket(2); // 2A
@@ -191,7 +187,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_readNewCommitLogsAndFindThreshold_commitsAtBucketTimes() {
+  void test_readNewCommitLogsAndFindThreshold_commitsAtBucketTimes() {
     DateTime now = clock.nowUtc();
     ImmutableMap<Integer, DateTime> bucketTimes =
         ImmutableMap.of(1, now.minusMillis(1), 2, now, 3, now.plusMillis(1));
@@ -199,7 +195,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeBucketCheckpointTimes_earlyThreshold_setsEverythingToThreshold() {
+  void test_computeBucketCheckpointTimes_earlyThreshold_setsEverythingToThreshold() {
     DateTime now = clock.nowUtc();
     ImmutableMap<Integer, DateTime> bucketTimes =
         ImmutableMap.of(1, now.minusMillis(1), 2, now, 3, now.plusMillis(1));
@@ -208,7 +204,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeBucketCheckpointTimes_middleThreshold_clampsToThreshold() {
+  void test_computeBucketCheckpointTimes_middleThreshold_clampsToThreshold() {
     DateTime now = clock.nowUtc();
     ImmutableMap<Integer, DateTime> bucketTimes =
         ImmutableMap.of(1, now.minusMillis(1), 2, now, 3, now.plusMillis(1));
@@ -217,7 +213,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeBucketCheckpointTimes_lateThreshold_leavesBucketTimesAsIs() {
+  void test_computeBucketCheckpointTimes_lateThreshold_leavesBucketTimesAsIs() {
     DateTime now = clock.nowUtc();
     ImmutableMap<Integer, DateTime> bucketTimes =
         ImmutableMap.of(1, now.minusMillis(1), 2, now, 3, now.plusMillis(1));
@@ -226,7 +222,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeCheckpoint_noCommitsAtAll_bucketCheckpointTimesAreStartOfTime() {
+  void test_computeCheckpoint_noCommitsAtAll_bucketCheckpointTimesAreStartOfTime() {
     assertThat(strategy.computeCheckpoint())
         .isEqualTo(CommitLogCheckpoint.create(
             clock.nowUtc(),
@@ -234,7 +230,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeCheckpoint_noNewCommitLogs_bucketCheckpointTimesAreBucketTimes() {
+  void test_computeCheckpoint_noNewCommitLogs_bucketCheckpointTimesAreBucketTimes() {
     DateTime now = clock.nowUtc();
     writeCommitLogToBucket(1);
     clock.advanceOneMilli();
@@ -250,7 +246,7 @@ public class CommitLogCheckpointStrategyTest {
   }

   @Test
-  public void test_computeCheckpoint_someNewCommits_bucketCheckpointTimesAreClampedToThreshold() {
+  void test_computeCheckpoint_someNewCommits_bucketCheckpointTimesAreClampedToThreshold() {
     DateTime now = clock.nowUtc();
     writeCommitLogToBucket(1); // 1A
     writeCommitLogToBucket(2); // 2A


@@ -25,18 +25,15 @@ import google.registry.model.ofy.Ofy;
 import google.registry.testing.DatastoreHelper;
 import google.registry.testing.FakeClock;
 import google.registry.testing.FakeResponse;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
 import google.registry.testing.mapreduce.MapreduceTestCase;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link DeleteOldCommitLogsAction}. */
-@RunWith(JUnit4.class)
 public class DeleteOldCommitLogsActionTest
     extends MapreduceTestCase<DeleteOldCommitLogsAction> {
@@ -44,11 +41,10 @@ public class DeleteOldCommitLogsActionTest
   private final FakeResponse response = new FakeResponse();
   private ContactResource contact;

-  @Rule
-  public final InjectRule inject = new InjectRule();
+  @RegisterExtension public final InjectExtension inject = new InjectExtension();

-  @Before
-  public void setup() {
+  @BeforeEach
+  void beforeEach() {
     inject.setStaticField(Ofy.class, "clock", clock);
     action = new DeleteOldCommitLogsAction();
     action.mrRunner = makeDefaultRunner();
@@ -107,11 +103,9 @@ public class DeleteOldCommitLogsActionTest
     return ImmutableList.copyOf(ofy().load().type(clazz).iterable());
   }

-  /**
-   * Check that with very short maxAge, only the referenced elements remain.
-   */
+  /** Check that with very short maxAge, only the referenced elements remain. */
   @Test
-  public void test_shortMaxAge() throws Exception {
+  void test_shortMaxAge() throws Exception {
     runMapreduce(Duration.millis(1));

     assertThat(ImmutableList.copyOf(ofy().load().type(CommitLogManifest.class).keys().iterable()))
@@ -121,11 +115,9 @@ public class DeleteOldCommitLogsActionTest
     assertThat(ofyLoadType(CommitLogMutation.class)).hasSize(contact.getRevisions().size() * 3);
   }

-  /**
-   * Check that with very long maxAge, all the elements remain.
-   */
+  /** Check that with very long maxAge, all the elements remain. */
   @Test
-  public void test_longMaxAge() throws Exception {
+  void test_longMaxAge() throws Exception {
     ImmutableList<CommitLogManifest> initialManifests = ofyLoadType(CommitLogManifest.class);
     ImmutableList<CommitLogMutation> initialMutations = ofyLoadType(CommitLogMutation.class);


@@ -34,24 +34,21 @@ import google.registry.model.ofy.CommitLogBucket;
 import google.registry.model.ofy.CommitLogCheckpoint;
 import google.registry.model.ofy.CommitLogManifest;
 import google.registry.model.ofy.CommitLogMutation;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import google.registry.testing.GcsTestingUtils;
 import google.registry.testing.TestObject;
 import java.util.List;
 import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link ExportCommitLogDiffAction}. */
-@RunWith(JUnit4.class)
 public class ExportCommitLogDiffActionTest {

-  @Rule
-  public final AppEngineRule appEngine =
-      AppEngineRule.builder()
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder()
           .withDatastoreAndCloudSql()
           .withOfyTestEntities(TestObject.class)
           .build();
@@ -64,15 +61,15 @@ public class ExportCommitLogDiffActionTest {
   private final ExportCommitLogDiffAction task = new ExportCommitLogDiffAction();

-  @Before
-  public void before() {
+  @BeforeEach
+  void beforeEach() {
     task.gcsService = gcsService;
     task.gcsBucket = "gcs bucket";
     task.batchSize = 5;
   }

   @Test
-  public void testRun_noCommitHistory_onlyUpperCheckpointExported() throws Exception {
+  void testRun_noCommitHistory_onlyUpperCheckpointExported() throws Exception {
     task.lowerCheckpointTime = oneMinuteAgo;
     task.upperCheckpointTime = now;
@@ -104,7 +101,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_regularCommitHistory_exportsCorrectCheckpointDiff() throws Exception {
+  void testRun_regularCommitHistory_exportsCorrectCheckpointDiff() throws Exception {
     task.lowerCheckpointTime = oneMinuteAgo;
     task.upperCheckpointTime = now;
@@ -175,7 +172,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_simultaneousTransactions_bothExported() throws Exception {
+  void testRun_simultaneousTransactions_bothExported() throws Exception {
     task.lowerCheckpointTime = oneMinuteAgo;
     task.upperCheckpointTime = now;
@@ -227,7 +224,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_exportsAcrossMultipleBatches() throws Exception {
+  void testRun_exportsAcrossMultipleBatches() throws Exception {
     task.batchSize = 2;
     task.lowerCheckpointTime = oneMinuteAgo;
     task.upperCheckpointTime = now;
@@ -288,7 +285,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_checkpointDiffWithNeverTouchedBuckets_exportsCorrectly() throws Exception {
+  void testRun_checkpointDiffWithNeverTouchedBuckets_exportsCorrectly() throws Exception {
     task.lowerCheckpointTime = oneMinuteAgo;
     task.upperCheckpointTime = now;
@@ -322,8 +319,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_checkpointDiffWithNonExistentBucketTimestamps_exportsCorrectly()
-      throws Exception {
+  void testRun_checkpointDiffWithNonExistentBucketTimestamps_exportsCorrectly() throws Exception {
     // Non-existent bucket timestamps can exist when the commit log bucket count was increased
     // recently.
@@ -404,7 +400,7 @@ public class ExportCommitLogDiffActionTest {
   }

   @Test
-  public void testRun_exportingFromStartOfTime_exportsAllCommits() throws Exception {
+  void testRun_exportingFromStartOfTime_exportsAllCommits() throws Exception {
     task.lowerCheckpointTime = START_OF_TIME;
     task.upperCheckpointTime = now;


@@ -34,7 +34,7 @@ import com.google.appengine.tools.cloudstorage.ListResult;
 import com.google.common.collect.Iterators;
 import com.google.common.flogger.LoggerConfig;
 import com.google.common.testing.TestLogHandler;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import java.io.IOException;
 import java.lang.reflect.InvocationHandler;
 import java.lang.reflect.Method;
@@ -44,28 +44,26 @@ import java.util.List;
 import java.util.concurrent.Callable;
 import java.util.logging.LogRecord;
 import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link GcsDiffFileLister}. */
-@RunWith(JUnit4.class)
 public class GcsDiffFileListerTest {

-  static final String GCS_BUCKET = "gcs bucket";
+  private static final String GCS_BUCKET = "gcs bucket";

-  final DateTime now = DateTime.now(UTC);
-  final GcsDiffFileLister diffLister = new GcsDiffFileLister();
-  final GcsService gcsService = GcsServiceFactory.createGcsService();
+  private final DateTime now = DateTime.now(UTC);
+  private final GcsDiffFileLister diffLister = new GcsDiffFileLister();
+  private final GcsService gcsService = GcsServiceFactory.createGcsService();
   private final TestLogHandler logHandler = new TestLogHandler();

-  @Rule
-  public final AppEngineRule appEngine = AppEngineRule.builder().withDatastoreAndCloudSql().build();
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder().withDatastoreAndCloudSql().build();

-  @Before
-  public void before() throws Exception {
+  @BeforeEach
+  void beforeEach() throws Exception {
     diffLister.gcsService = gcsService;
     diffLister.gcsBucket = GCS_BUCKET;
     diffLister.executor = newDirectExecutorService();
@@ -111,13 +109,13 @@ public class GcsDiffFileListerTest {
   }

   @Test
-  public void testList_noFilesFound() {
+  void testList_noFilesFound() {
     DateTime fromTime = now.plusMillis(1);
     assertThat(listDiffFiles(fromTime, null)).isEmpty();
   }

   @Test
-  public void testList_patchesHoles() {
+  void testList_patchesHoles() {
     // Fake out the GCS list() method to return only the first and last file.
     // We can't use Mockito.spy() because GcsService's impl is final.
     diffLister.gcsService = (GcsService) newProxyInstance(
@@ -162,7 +160,7 @@ public class GcsDiffFileListerTest {
   }

   @Test
-  public void testList_failsOnFork() throws Exception {
+  void testList_failsOnFork() throws Exception {
     // We currently have files for now-4m ... now, construct the following sequence:
     //  now-8m <- now-7m <- now-6m  now-5m <- now-4m ... now
     //                      ^___________________________|
@@ -179,7 +177,7 @@ public class GcsDiffFileListerTest {
   }

   @Test
-  public void testList_boundaries() {
+  void testList_boundaries() {
     assertThat(listDiffFiles(now.minusMinutes(4), now))
         .containsExactly(
             now.minusMinutes(4),
@@ -192,7 +190,7 @@ public class GcsDiffFileListerTest {
   }

   @Test
-  public void testList_failsOnGaps() throws Exception {
+  void testList_failsOnGaps() throws Exception {
     // We currently have files for now-4m ... now, construct the following sequence:
     //  now-8m <- now-7m <- now-6m  {missing} <- now-4m ... now
     for (int i = 6; i < 9; ++i) {
@@ -228,7 +226,7 @@ public class GcsDiffFileListerTest {
   }

   @Test
-  public void testList_toTimeSpecified() {
+  void testList_toTimeSpecified() {
     assertThat(listDiffFiles(
             now.minusMinutes(4).minusSeconds(1), now.minusMinutes(2).plusSeconds(1)))
         .containsExactly(


@@ -42,7 +42,7 @@ import google.registry.model.ofy.CommitLogCheckpoint;
 import google.registry.model.ofy.CommitLogCheckpointRoot;
 import google.registry.model.ofy.CommitLogManifest;
 import google.registry.model.ofy.CommitLogMutation;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import google.registry.testing.FakeClock;
 import google.registry.testing.FakeSleeper;
 import google.registry.testing.TestObject;
@@ -54,31 +54,28 @@ import java.util.List;
 import java.util.Map;
 import java.util.Map.Entry;
 import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;

 /** Unit tests for {@link RestoreCommitLogsAction}. */
-@RunWith(JUnit4.class)
 public class RestoreCommitLogsActionTest {

-  static final String GCS_BUCKET = "gcs bucket";
+  private static final String GCS_BUCKET = "gcs bucket";

-  final DateTime now = DateTime.now(UTC);
-  final RestoreCommitLogsAction action = new RestoreCommitLogsAction();
-  final GcsService gcsService = createGcsService();
+  private final DateTime now = DateTime.now(UTC);
+  private final RestoreCommitLogsAction action = new RestoreCommitLogsAction();
+  private final GcsService gcsService = createGcsService();

-  @Rule
-  public final AppEngineRule appEngine =
-      AppEngineRule.builder()
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder()
           .withDatastoreAndCloudSql()
           .withOfyTestEntities(TestObject.class)
           .build();

-  @Before
-  public void init() {
+  @BeforeEach
+  void beforeEach() {
     action.gcsService = gcsService;
     action.dryRun = false;
     action.datastoreService = DatastoreServiceFactory.getDatastoreService();
@@ -91,7 +88,7 @@ public class RestoreCommitLogsActionTest {
   }

   @Test
-  public void testRestore_multipleDiffFiles() throws Exception {
+  void testRestore_multipleDiffFiles() throws Exception {
     ofy().saveWithoutBackup().entities(
         TestObject.create("previous to keep"),
         TestObject.create("previous to delete")).now();
@@ -141,7 +138,7 @@ public class RestoreCommitLogsActionTest {
   }

   @Test
-  public void testRestore_noManifests() throws Exception {
+  void testRestore_noManifests() throws Exception {
     ofy().saveWithoutBackup().entity(
         TestObject.create("previous to keep")).now();
     saveDiffFileNotToRestore(now.minusMinutes(1));
@@ -155,7 +152,7 @@ public class RestoreCommitLogsActionTest {
   }

   @Test
-  public void testRestore_manifestWithNoDeletions() throws Exception {
+  void testRestore_manifestWithNoDeletions() throws Exception {
     ofy().saveWithoutBackup().entity(TestObject.create("previous to keep")).now();
     Key<CommitLogBucket> bucketKey = getBucketKey(1);
     Key<CommitLogManifest> manifestKey = CommitLogManifest.createKey(bucketKey, now);
@@ -174,7 +171,7 @@ public class RestoreCommitLogsActionTest {
   }

   @Test
-  public void testRestore_manifestWithNoMutations() throws Exception {
+  void testRestore_manifestWithNoMutations() throws Exception {
     ofy().saveWithoutBackup().entities(
         TestObject.create("previous to keep"),
         TestObject.create("previous to delete")).now();
@@ -195,7 +192,7 @@ public class RestoreCommitLogsActionTest {

   // This is a pathological case that shouldn't be possible, but we should be robust to it.
   @Test
-  public void testRestore_manifestWithNoMutationsOrDeletions() throws Exception {
+  void testRestore_manifestWithNoMutationsOrDeletions() throws Exception {
     ofy().saveWithoutBackup().entities(
         TestObject.create("previous to keep")).now();
     saveDiffFileNotToRestore(now.minusMinutes(1));
@@ -211,7 +208,7 @@ public class RestoreCommitLogsActionTest {
   }

   @Test
-  public void testRestore_mutateExistingEntity() throws Exception {
+  void testRestore_mutateExistingEntity() throws Exception {
     ofy().saveWithoutBackup().entity(TestObject.create("existing", "a")).now();
     Key<CommitLogManifest> manifestKey = CommitLogManifest.createKey(getBucketKey(1), now);
     saveDiffFileNotToRestore(now.minusMinutes(1));
@@ -229,7 +226,7 @@ public class RestoreCommitLogsActionTest {

   // This should be harmless; deletes are idempotent.
   @Test
-  public void testRestore_deleteMissingEntity() throws Exception {
+  void testRestore_deleteMissingEntity() throws Exception {
     ofy().saveWithoutBackup().entity(TestObject.create("previous to keep", "a")).now();
     saveDiffFileNotToRestore(now.minusMinutes(1));
     Iterable<ImmutableObject> commitLogs = saveDiffFile(


@@ -39,10 +39,10 @@ import com.google.common.flogger.LoggerConfig;
 import com.googlecode.objectify.Key;
 import google.registry.model.contact.ContactResource;
 import google.registry.schema.domain.RegistryLock;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
 import google.registry.testing.FakeClock;
 import google.registry.testing.FakeSleeper;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
 import google.registry.testing.TaskQueueHelper.TaskMatcher;
 import google.registry.util.AppEngineServiceUtils;
 import google.registry.util.CapturingLogHandler;
@@ -50,26 +50,24 @@ import google.registry.util.Retrier;
 import java.util.logging.Level;
 import org.joda.time.DateTime;
 import org.joda.time.Duration;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.junit.jupiter.api.extension.RegisterExtension;
 import org.mockito.Mock;
-import org.mockito.junit.MockitoJUnit;
-import org.mockito.junit.MockitoRule;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.mockito.junit.jupiter.MockitoSettings;
+import org.mockito.quality.Strictness;

 /** Unit tests for {@link AsyncTaskEnqueuer}. */
-@RunWith(JUnit4.class)
+@ExtendWith(MockitoExtension.class)
 public class AsyncTaskEnqueuerTest {

-  @Rule
-  public final AppEngineRule appEngine =
-      AppEngineRule.builder().withDatastoreAndCloudSql().withTaskQueue().build();
+  @RegisterExtension
+  public final AppEngineExtension appEngine =
+      AppEngineExtension.builder().withDatastoreAndCloudSql().withTaskQueue().build();

-  @Rule public final InjectRule inject = new InjectRule();
-  @Rule public final MockitoRule mocks = MockitoJUnit.rule();
+  @RegisterExtension public final InjectExtension inject = new InjectExtension();

   @Mock private AppEngineServiceUtils appEngineServiceUtils;
@@ -77,8 +75,8 @@ public class AsyncTaskEnqueuerTest {
   private final CapturingLogHandler logHandler = new CapturingLogHandler();
   private final FakeClock clock = new FakeClock(DateTime.parse("2015-05-18T12:34:56Z"));

-  @Before
-  public void setUp() {
+  @BeforeEach
+  void beforeEach() {
     LoggerConfig.getConfig(AsyncTaskEnqueuer.class).addHandler(logHandler);
     when(appEngineServiceUtils.getServiceHostname("backend")).thenReturn("backend.hostname.fake");
     asyncTaskEnqueuer = createForTesting(appEngineServiceUtils, clock, standardSeconds(90));
@@ -96,7 +94,7 @@ public class AsyncTaskEnqueuerTest {
   }

   @Test
-  public void test_enqueueAsyncResave_success() {
+  void test_enqueueAsyncResave_success() {
     ContactResource contact = persistActiveContact("jd23456");
     asyncTaskEnqueuer.enqueueAsyncResave(contact, clock.nowUtc(), clock.nowUtc().plusDays(5));
     assertTasksEnqueued(
@@ -114,7 +112,7 @@ public class AsyncTaskEnqueuerTest {
   }

   @Test
-  public void test_enqueueAsyncResave_multipleResaves() {
+  void test_enqueueAsyncResave_multipleResaves() {
     ContactResource contact = persistActiveContact("jd23456");
     DateTime now = clock.nowUtc();
     asyncTaskEnqueuer.enqueueAsyncResave(
@@ -130,16 +128,15 @@ public class AsyncTaskEnqueuerTest {
             .header("content-type", "application/x-www-form-urlencoded")
             .param(PARAM_RESOURCE_KEY, Key.create(contact).getString())
             .param(PARAM_REQUESTED_TIME, now.toString())
-            .param(
-                PARAM_RESAVE_TIMES,
-                "2015-05-20T14:34:56.000Z,2015-05-21T15:34:56.000Z")
+            .param(PARAM_RESAVE_TIMES, "2015-05-20T14:34:56.000Z,2015-05-21T15:34:56.000Z")
             .etaDelta(
                 standardHours(24).minus(standardSeconds(30)),
                 standardHours(24).plus(standardSeconds(30))));
   }

+  @MockitoSettings(strictness = Strictness.LENIENT)
   @Test
-  public void test_enqueueAsyncResave_ignoresTasksTooFarIntoFuture() throws Exception {
+  void test_enqueueAsyncResave_ignoresTasksTooFarIntoFuture() throws Exception {
     ContactResource contact = persistActiveContact("jd23456");
     asyncTaskEnqueuer.enqueueAsyncResave(contact, clock.nowUtc(), clock.nowUtc().plusDays(31));
     assertNoTasksEnqueued(QUEUE_ASYNC_ACTIONS);
@@ -147,7 +144,7 @@ public class AsyncTaskEnqueuerTest {
   }

   @Test
-  public void testEnqueueRelock() {
+  void testEnqueueRelock() {
     RegistryLock lock =
         saveRegistryLock(
             new RegistryLock.Builder()
@@ -168,6 +165,7 @@ public class AsyncTaskEnqueuerTest {
         new TaskMatcher()
             .url(RelockDomainAction.PATH)
             .method("POST")
+            .header("Host", "backend.hostname.fake")
             .param(
                 RelockDomainAction.OLD_UNLOCK_REVISION_ID_PARAM,
                 String.valueOf(lock.getRevisionId()))
@@ -176,8 +174,9 @@ public class AsyncTaskEnqueuerTest {
                 standardHours(6).plus(standardSeconds(30))));
   }

+  @MockitoSettings(strictness = Strictness.LENIENT)
   @Test
-  public void testFailure_enqueueRelock_noDuration() {
+  void testFailure_enqueueRelock_noDuration() {
     RegistryLock lockWithoutDuration =
         saveRegistryLock(
             new RegistryLock.Builder()


@@ -21,19 +21,16 @@ import static google.registry.batch.AsyncTaskMetrics.OperationType.CONTACT_AND_H
 import com.google.common.collect.ImmutableSet;
 import google.registry.testing.FakeClock;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.Test;

 /** Unit tests for {@link AsyncTaskMetrics}. */
-@RunWith(JUnit4.class)
-public class AsyncTaskMetricsTest {
+class AsyncTaskMetricsTest {

   private final FakeClock clock = new FakeClock();
   private final AsyncTaskMetrics asyncTaskMetrics = new AsyncTaskMetrics(clock);

   @Test
-  public void testRecordAsyncFlowResult_calculatesDurationMillisCorrectly() {
+  void testRecordAsyncFlowResult_calculatesDurationMillisCorrectly() {
     asyncTaskMetrics.recordAsyncFlowResult(
         CONTACT_AND_HOST_DELETE,
         SUCCESS,


@@ -94,7 +94,7 @@ import google.registry.model.transfer.TransferStatus;
import google.registry.testing.FakeClock;
import google.registry.testing.FakeResponse;
import google.registry.testing.FakeSleeper;
import google.registry.testing.InjectRule;
import google.registry.testing.InjectExtension;
import google.registry.testing.TaskQueueHelper.TaskMatcher;
import google.registry.testing.mapreduce.MapreduceTestCase;
import google.registry.util.AppEngineServiceUtils;
@@ -105,19 +105,16 @@ import google.registry.util.SystemSleeper;
import java.util.Optional;
import org.joda.time.DateTime;
import org.joda.time.Duration;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.mockito.Mock;
/** Unit tests for {@link DeleteContactsAndHostsAction}. */
@RunWith(JUnit4.class)
public class DeleteContactsAndHostsActionTest
extends MapreduceTestCase<DeleteContactsAndHostsAction> {
@Rule public final InjectRule inject = new InjectRule();
@RegisterExtension public final InjectExtension inject = new InjectExtension();
private AsyncTaskEnqueuer enqueuer;
private final FakeClock clock = new FakeClock(DateTime.parse("2015-01-15T11:22:33Z"));
@@ -146,8 +143,8 @@ public class DeleteContactsAndHostsActionTest
ofy().clearSessionCache();
}
@Before
public void setup() {
@BeforeEach
void beforeEach() {
inject.setStaticField(Ofy.class, "clock", clock);
enqueuer =
AsyncTaskEnqueuerTest.createForTesting(
@@ -171,7 +168,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
public void testSuccess_contact_referencedByActiveDomain_doesNotGetDeleted() throws Exception {
void testSuccess_contact_referencedByActiveDomain_doesNotGetDeleted() throws Exception {
ContactResource contact = persistContactPendingDelete("blah8221");
persistResource(newDomainBase("example.tld", contact));
DateTime timeEnqueued = clock.nowUtc();
@@ -211,17 +208,17 @@ public class DeleteContactsAndHostsActionTest
}
@Test
public void testSuccess_contact_notReferenced_getsDeleted_andPiiWipedOut() throws Exception {
void testSuccess_contact_notReferenced_getsDeleted_andPiiWipedOut() throws Exception {
runSuccessfulContactDeletionTest(Optional.of("fakeClientTrid"));
}
@Test
public void testSuccess_contact_andNoClientTrid_deletesSuccessfully() throws Exception {
void testSuccess_contact_andNoClientTrid_deletesSuccessfully() throws Exception {
runSuccessfulContactDeletionTest(Optional.empty());
}
@Test
-public void test_cannotAcquireLock() {
+void test_cannotAcquireLock() {
// Make lock acquisition fail.
acquireLock();
enqueueMapreduceOnly();
@@ -229,7 +226,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void test_mapreduceHasWorkToDo_lockIsAcquired() {
+void test_mapreduceHasWorkToDo_lockIsAcquired() {
ContactResource contact = persistContactPendingDelete("blah8221");
persistResource(newDomainBase("example.tld", contact));
DateTime timeEnqueued = clock.nowUtc();
@@ -244,7 +241,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void test_noTasksToLease_releasesLockImmediately() {
+void test_noTasksToLease_releasesLockImmediately() {
enqueueMapreduceOnly();
// If the Lock was correctly released, then we can acquire it now.
assertThat(acquireLock()).isPresent();
@@ -293,8 +290,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_contactWithoutPendingTransfer_isDeletedAndHasNoTransferData()
-    throws Exception {
+void testSuccess_contactWithoutPendingTransfer_isDeletedAndHasNoTransferData() throws Exception {
ContactResource contact = persistContactPendingDelete("blah8221");
enqueuer.enqueueAsyncDelete(
contact,
@@ -308,7 +304,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_contactWithPendingTransfer_getsDeleted() throws Exception {
+void testSuccess_contactWithPendingTransfer_getsDeleted() throws Exception {
DateTime transferRequestTime = clock.nowUtc().minusDays(3);
ContactResource contact =
persistContactWithPendingTransfer(
@@ -371,7 +367,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_contact_referencedByDeletedDomain_getsDeleted() throws Exception {
+void testSuccess_contact_referencedByDeletedDomain_getsDeleted() throws Exception {
ContactResource contactUsed = persistContactPendingDelete("blah1234");
persistResource(
newDomainBase("example.tld", contactUsed)
@@ -410,7 +406,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_contact_notRequestedByOwner_doesNotGetDeleted() throws Exception {
+void testSuccess_contact_notRequestedByOwner_doesNotGetDeleted() throws Exception {
ContactResource contact = persistContactPendingDelete("jane0991");
enqueuer.enqueueAsyncDelete(
contact,
@@ -438,7 +434,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_contact_notRequestedByOwner_isSuperuser_getsDeleted() throws Exception {
+void testSuccess_contact_notRequestedByOwner_isSuperuser_getsDeleted() throws Exception {
ContactResource contact = persistContactWithPii("nate007");
enqueuer.enqueueAsyncDelete(
contact,
@@ -480,7 +476,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_targetResourcesDontExist_areDelayedForADay() throws Exception {
+void testSuccess_targetResourcesDontExist_areDelayedForADay() throws Exception {
ContactResource contactNotSaved = newContactResource("somecontact");
HostResource hostNotSaved = newHostResource("a11.blah.foo");
DateTime timeBeforeRun = clock.nowUtc();
@@ -519,7 +515,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_unparseableTasks_areDelayedForADay() throws Exception {
+void testSuccess_unparseableTasks_areDelayedForADay() throws Exception {
TaskOptions task =
TaskOptions.Builder.withMethod(Method.PULL).param("gobbledygook", "kljhadfgsd9f7gsdfh");
getQueue(QUEUE_ASYNC_DELETE).add(task);
@@ -535,7 +531,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_resourcesNotInPendingDelete_areSkipped() throws Exception {
+void testSuccess_resourcesNotInPendingDelete_areSkipped() throws Exception {
ContactResource contact = persistActiveContact("blah2222");
HostResource host = persistActiveHost("rustles.your.jimmies");
DateTime timeEnqueued = clock.nowUtc();
@@ -567,7 +563,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_alreadyDeletedResources_areSkipped() throws Exception {
+void testSuccess_alreadyDeletedResources_areSkipped() throws Exception {
ContactResource contactDeleted = persistDeletedContact("blah1236", clock.nowUtc().minusDays(2));
HostResource hostDeleted = persistDeletedHost("a.lim.lop", clock.nowUtc().minusDays(3));
enqueuer.enqueueAsyncDelete(
@@ -590,7 +586,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_host_referencedByActiveDomain_doesNotGetDeleted() throws Exception {
+void testSuccess_host_referencedByActiveDomain_doesNotGetDeleted() throws Exception {
HostResource host = persistHostPendingDelete("ns1.example.tld");
persistUsedDomain("example.tld", persistActiveContact("abc456"), host);
DateTime timeEnqueued = clock.nowUtc();
@@ -627,12 +623,12 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_host_notReferenced_getsDeleted() throws Exception {
+void testSuccess_host_notReferenced_getsDeleted() throws Exception {
runSuccessfulHostDeletionTest(Optional.of("fakeClientTrid"));
}
@Test
-public void testSuccess_host_andNoClientTrid_deletesSuccessfully() throws Exception {
+void testSuccess_host_andNoClientTrid_deletesSuccessfully() throws Exception {
runSuccessfulHostDeletionTest(Optional.empty());
}
@@ -675,7 +671,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_host_referencedByDeletedDomain_getsDeleted() throws Exception {
+void testSuccess_host_referencedByDeletedDomain_getsDeleted() throws Exception {
HostResource host = persistHostPendingDelete("ns1.example.tld");
persistResource(
newDomainBase("example.tld")
@@ -715,7 +711,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_subordinateHost_getsDeleted() throws Exception {
+void testSuccess_subordinateHost_getsDeleted() throws Exception {
DomainBase domain =
persistResource(
newDomainBase("example.tld")
@@ -766,7 +762,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_host_notRequestedByOwner_doesNotGetDeleted() throws Exception {
+void testSuccess_host_notRequestedByOwner_doesNotGetDeleted() throws Exception {
HostResource host = persistHostPendingDelete("ns2.example.tld");
enqueuer.enqueueAsyncDelete(
host,
@@ -794,7 +790,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_host_notRequestedByOwner_isSuperuser_getsDeleted() throws Exception {
+void testSuccess_host_notRequestedByOwner_isSuperuser_getsDeleted() throws Exception {
HostResource host = persistHostPendingDelete("ns66.example.tld");
enqueuer.enqueueAsyncDelete(
host,
@@ -828,7 +824,7 @@ public class DeleteContactsAndHostsActionTest
}
@Test
-public void testSuccess_deleteABunchOfContactsAndHosts_butNotSome() throws Exception {
+void testSuccess_deleteABunchOfContactsAndHosts_butNotSome() throws Exception {
ContactResource c1 = persistContactPendingDelete("nsaid54");
ContactResource c2 = persistContactPendingDelete("nsaid55");
ContactResource c3 = persistContactPendingDelete("nsaid57");


@@ -46,29 +46,26 @@ import google.registry.model.registry.Registry;
import google.registry.model.registry.Registry.TldType;
import google.registry.model.reporting.HistoryEntry;
import google.registry.testing.FakeResponse;
-import google.registry.testing.SystemPropertyRule;
+import google.registry.testing.SystemPropertyExtension;
import google.registry.testing.mapreduce.MapreduceTestCase;
import java.util.Optional;
import java.util.Set;
import org.joda.money.Money;
import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runns.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
/** Unit tests for {@link DeleteProberDataAction}. */
-@RunWith(JUnit4.class)
-public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDataAction> {
+class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDataAction> {
private static final DateTime DELETION_TIME = DateTime.parse("2010-01-01T00:00:00.000Z");
-@Rule
-public final SystemPropertyRule systemPropertyRule = new SystemPropertyRule();
+@RegisterExtension
+final SystemPropertyExtension systemPropertyExtension = new SystemPropertyExtension();
-@Before
-public void init() {
+@BeforeEach
+void beforeEach() {
// Entities in these two should not be touched.
createTld("tld", "TLD");
// Since "example" doesn't end with .test, its entities won't be deleted even though it is of
@@ -96,7 +93,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
action.isDryRun = false;
action.tlds = ImmutableSet.of();
action.registryAdminClientId = "TheRegistrar";
-RegistryEnvironment.SANDBOX.setup(systemPropertyRule);
+RegistryEnvironment.SANDBOX.setup(systemPropertyExtension);
}
private void runMapreduce() throws Exception {
@@ -105,7 +102,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void test_deletesAllAndOnlyProberData() throws Exception {
+void test_deletesAllAndOnlyProberData() throws Exception {
Set<ImmutableObject> tldEntities = persistLotsOfDomains("tld");
Set<ImmutableObject> exampleEntities = persistLotsOfDomains("example");
Set<ImmutableObject> notTestEntities = persistLotsOfDomains("not-test.test");
@@ -120,7 +117,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testSuccess_deletesAllAndOnlyGivenTlds() throws Exception {
+void testSuccess_deletesAllAndOnlyGivenTlds() throws Exception {
Set<ImmutableObject> tldEntities = persistLotsOfDomains("tld");
Set<ImmutableObject> exampleEntities = persistLotsOfDomains("example");
Set<ImmutableObject> notTestEntities = persistLotsOfDomains("not-test.test");
@@ -136,7 +133,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testFail_givenNonTestTld() {
+void testFail_givenNonTestTld() {
action.tlds = ImmutableSet.of("not-test.test");
IllegalArgumentException thrown =
assertThrows(IllegalArgumentException.class, this::runMapreduce);
@@ -146,7 +143,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testFail_givenNonExistentTld() {
+void testFail_givenNonExistentTld() {
action.tlds = ImmutableSet.of("non-existent.test");
IllegalArgumentException thrown =
assertThrows(IllegalArgumentException.class, this::runMapreduce);
@@ -156,9 +153,9 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testFail_givenNonDotTestTldOnProd() {
+void testFail_givenNonDotTestTldOnProd() {
action.tlds = ImmutableSet.of("example");
-RegistryEnvironment.PRODUCTION.setup(systemPropertyRule);
+RegistryEnvironment.PRODUCTION.setup(systemPropertyExtension);
IllegalArgumentException thrown =
assertThrows(IllegalArgumentException.class, this::runMapreduce);
assertThat(thrown)
@@ -167,7 +164,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testSuccess_doesntDeleteNicDomainForProbers() throws Exception {
+void testSuccess_doesntDeleteNicDomainForProbers() throws Exception {
DomainBase nic = persistActiveDomain("nic.ib-any.test");
ForeignKeyIndex<DomainBase> fkiNic =
ForeignKeyIndex.load(DomainBase.class, "nic.ib-any.test", START_OF_TIME);
@@ -178,7 +175,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testDryRun_doesntDeleteData() throws Exception {
+void testDryRun_doesntDeleteData() throws Exception {
Set<ImmutableObject> tldEntities = persistLotsOfDomains("tld");
Set<ImmutableObject> oaEntities = persistLotsOfDomains("oa-canary.test");
action.isDryRun = true;
@@ -188,7 +185,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testSuccess_activeDomain_isSoftDeleted() throws Exception {
+void testSuccess_activeDomain_isSoftDeleted() throws Exception {
DomainBase domain = persistResource(
newDomainBase("blah.ib-any.test")
.asBuilder()
@@ -203,7 +200,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testSuccess_activeDomain_doubleMapSoftDeletes() throws Exception {
+void testSuccess_activeDomain_doubleMapSoftDeletes() throws Exception {
DomainBase domain = persistResource(
newDomainBase("blah.ib-any.test")
.asBuilder()
@@ -220,7 +217,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void test_recentlyCreatedDomain_isntDeletedYet() throws Exception {
+void test_recentlyCreatedDomain_isntDeletedYet() throws Exception {
persistResource(
newDomainBase("blah.ib-any.test")
.asBuilder()
@@ -234,7 +231,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testDryRun_doesntSoftDeleteData() throws Exception {
+void testDryRun_doesntSoftDeleteData() throws Exception {
DomainBase domain = persistResource(
newDomainBase("blah.ib-any.test")
.asBuilder()
@@ -246,7 +243,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void test_domainWithSubordinateHosts_isSkipped() throws Exception {
+void test_domainWithSubordinateHosts_isSkipped() throws Exception {
persistActiveHost("ns1.blah.ib-any.test");
DomainBase nakedDomain =
persistDeletedDomain("todelete.ib-any.test", DateTime.now(UTC).minusYears(1));
@@ -263,7 +260,7 @@ public class DeleteProberDataActionTest extends MapreduceTestCase<DeleteProberDa
}
@Test
-public void testFailure_registryAdminClientId_isRequiredForSoftDeletion() {
+void testFailure_registryAdminClientId_isRequiredForSoftDeletion() {
persistResource(
newDomainBase("blah.ib-any.test")
.asBuilder()


@@ -52,35 +52,32 @@ import google.registry.model.reporting.HistoryEntry;
import google.registry.schema.cursor.CursorDao;
import google.registry.testing.FakeClock;
import google.registry.testing.FakeResponse;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
import google.registry.testing.mapreduce.MapreduceTestCase;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import org.joda.money.Money;
import org.joda.time.DateTime;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
/** Unit tests for {@link ExpandRecurringBillingEventsAction}. */
-@RunWith(JUnit4.class)
public class ExpandRecurringBillingEventsActionTest
extends MapreduceTestCase<ExpandRecurringBillingEventsAction> {
-@Rule
-public final InjectRule inject = new InjectRule();
+@RegisterExtension public final InjectExtension inject = new InjectExtension();
private final DateTime beginningOfTest = DateTime.parse("2000-10-02T00:00:00Z");
private final FakeClock clock = new FakeClock(beginningOfTest);
-DomainBase domain;
-HistoryEntry historyEntry;
-BillingEvent.Recurring recurring;
+private DomainBase domain;
+private HistoryEntry historyEntry;
+private BillingEvent.Recurring recurring;
-@Before
-public void init() {
+@BeforeEach
+void beforeEach() {
inject.setStaticField(Ofy.class, "clock", clock);
action = new ExpandRecurringBillingEventsAction();
action.mrRunner = makeDefaultRunner();
@@ -161,7 +158,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent() throws Exception {
+void testSuccess_expandSingleEvent() throws Exception {
persistResource(recurring);
action.cursorTimeParam = Optional.of(START_OF_TIME);
runMapreduce();
@@ -176,7 +173,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_deletedDomain() throws Exception {
+void testSuccess_expandSingleEvent_deletedDomain() throws Exception {
DateTime deletionTime = DateTime.parse("2000-08-01T00:00:00Z");
DomainBase deletedDomain = persistDeletedDomain("deleted.tld", deletionTime);
historyEntry = persistResource(new HistoryEntry.Builder().setParent(deletedDomain).build());
@@ -208,7 +205,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_idempotentForDuplicateRuns() throws Exception {
+void testSuccess_expandSingleEvent_idempotentForDuplicateRuns() throws Exception {
persistResource(recurring);
action.cursorTimeParam = Optional.of(START_OF_TIME);
runMapreduce();
@@ -225,7 +222,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_idempotentForExistingOneTime() throws Exception {
+void testSuccess_expandSingleEvent_idempotentForExistingOneTime() throws Exception {
persistResource(recurring);
BillingEvent.OneTime persisted = persistResource(defaultOneTimeBuilder()
.setParent(historyEntry)
@@ -240,8 +237,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_notIdempotentForDifferentBillingTime()
-    throws Exception {
+void testSuccess_expandSingleEvent_notIdempotentForDifferentBillingTime() throws Exception {
persistResource(recurring);
action.cursorTimeParam = Optional.of(START_OF_TIME);
runMapreduce();
@@ -259,8 +255,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_notIdempotentForDifferentRecurring()
-    throws Exception {
+void testSuccess_expandSingleEvent_notIdempotentForDifferentRecurring() throws Exception {
persistResource(recurring);
BillingEvent.Recurring recurring2 = persistResource(recurring.asBuilder()
.setId(3L)
@@ -289,7 +284,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_ignoreRecurringBeforeWindow() throws Exception {
+void testSuccess_ignoreRecurringBeforeWindow() throws Exception {
recurring = persistResource(recurring.asBuilder()
.setEventTime(DateTime.parse("1997-01-05T00:00:00Z"))
.setRecurrenceEndTime(DateTime.parse("1999-10-05T00:00:00Z"))
@@ -303,7 +298,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_ignoreRecurringAfterWindow() throws Exception {
+void testSuccess_ignoreRecurringAfterWindow() throws Exception {
recurring = persistResource(recurring.asBuilder()
.setEventTime(clock.nowUtc().plusYears(2))
.build());
@@ -315,7 +310,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_billingTimeAtCursorTime() throws Exception {
+void testSuccess_expandSingleEvent_billingTimeAtCursorTime() throws Exception {
persistResource(recurring);
action.cursorTimeParam = Optional.of(DateTime.parse("2000-02-19T00:00:00Z"));
runMapreduce();
@@ -328,8 +323,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_cursorTimeBetweenEventAndBillingTime()
-    throws Exception {
+void testSuccess_expandSingleEvent_cursorTimeBetweenEventAndBillingTime() throws Exception {
persistResource(recurring);
action.cursorTimeParam = Optional.of(DateTime.parse("2000-01-12T00:00:00Z"));
runMapreduce();
@@ -342,7 +336,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_billingTimeAtExecutionTime() throws Exception {
+void testSuccess_expandSingleEvent_billingTimeAtExecutionTime() throws Exception {
DateTime testTime = DateTime.parse("2000-02-19T00:00:00Z").minusMillis(1);
persistResource(recurring);
action.cursorTimeParam = Optional.of(START_OF_TIME);
@@ -359,7 +353,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_multipleYearCreate() throws Exception {
+void testSuccess_expandSingleEvent_multipleYearCreate() throws Exception {
DateTime testTime = beginningOfTest.plusYears(2);
action.cursorTimeParam = Optional.of(recurring.getEventTime());
recurring =
@@ -381,7 +375,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_withCursor() throws Exception {
+void testSuccess_expandSingleEvent_withCursor() throws Exception {
persistResource(recurring);
saveCursor(START_OF_TIME);
runMapreduce();
@@ -394,7 +388,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_withCursorPastExpected() throws Exception {
+void testSuccess_expandSingleEvent_withCursorPastExpected() throws Exception {
persistResource(recurring);
// Simulate a quick second run of the mapreduce (this should be a no-op).
saveCursor(clock.nowUtc().minusSeconds(1));
@@ -406,7 +400,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_recurrenceEndBeforeEvent() throws Exception {
+void testSuccess_expandSingleEvent_recurrenceEndBeforeEvent() throws Exception {
// This can occur when a domain is transferred or deleted before a domain comes up for renewal.
recurring = persistResource(recurring.asBuilder()
.setRecurrenceEndTime(recurring.getEventTime().minusDays(5))
@@ -420,7 +414,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_dryRun() throws Exception {
+void testSuccess_expandSingleEvent_dryRun() throws Exception {
persistResource(recurring);
action.isDryRun = true;
saveCursor(START_OF_TIME); // Need a saved cursor to verify that it didn't move.
@@ -432,7 +426,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_multipleYears() throws Exception {
+void testSuccess_expandSingleEvent_multipleYears() throws Exception {
DateTime testTime = clock.nowUtc().plusYears(5);
clock.setTo(testTime);
List<BillingEvent> expectedEvents = new ArrayList<>();
@@ -463,7 +457,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_multipleYears_cursorInBetweenYears() throws Exception {
+void testSuccess_expandSingleEvent_multipleYears_cursorInBetweenYears() throws Exception {
DateTime testTime = clock.nowUtc().plusYears(5);
clock.setTo(testTime);
List<BillingEvent> expectedEvents = new ArrayList<>();
@@ -492,7 +486,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_singleEvent_beforeRenewal() throws Exception {
+void testSuccess_singleEvent_beforeRenewal() throws Exception {
DateTime testTime = DateTime.parse("2000-01-04T00:00:00Z");
clock.setTo(testTime);
persistResource(recurring);
@@ -505,7 +499,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_singleEvent_afterRecurrenceEnd_inAutorenewGracePeriod() throws Exception {
+void testSuccess_singleEvent_afterRecurrenceEnd_inAutorenewGracePeriod() throws Exception {
// The domain creation date is 1999-01-05, and the first renewal date is thus 2000-01-05.
DateTime testTime = DateTime.parse("2001-02-06T00:00:00Z");
clock.setTo(testTime);
@@ -530,8 +524,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_singleEvent_afterRecurrenceEnd_outsideAutorenewGracePeriod()
-    throws Exception {
+void testSuccess_singleEvent_afterRecurrenceEnd_outsideAutorenewGracePeriod() throws Exception {
// The domain creation date is 1999-01-05, and the first renewal date is thus 2000-01-05.
DateTime testTime = DateTime.parse("2001-02-06T00:00:00Z");
clock.setTo(testTime);
@@ -556,7 +549,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_billingTimeOnLeapYear() throws Exception {
+void testSuccess_expandSingleEvent_billingTimeOnLeapYear() throws Exception {
recurring =
persistResource(
recurring.asBuilder().setEventTime(DateTime.parse("2000-01-15T00:00:00Z")).build());
@@ -575,7 +568,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandSingleEvent_billingTimeNotOnLeapYear() throws Exception {
+void testSuccess_expandSingleEvent_billingTimeNotOnLeapYear() throws Exception {
DateTime testTime = DateTime.parse("2001-12-01T00:00:00Z");
recurring =
persistResource(
@@ -597,7 +590,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_expandMultipleEvents() throws Exception {
+void testSuccess_expandMultipleEvents() throws Exception {
persistResource(recurring);
BillingEvent.Recurring recurring2 = persistResource(recurring.asBuilder()
.setEventTime(recurring.getEventTime().plusMonths(3))
@@ -630,7 +623,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_premiumDomain() throws Exception {
+void testSuccess_premiumDomain() throws Exception {
persistResource(
Registry.get("tld")
.asBuilder()
@@ -651,7 +644,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testSuccess_varyingRenewPrices() throws Exception {
+void testSuccess_varyingRenewPrices() throws Exception {
DateTime testTime = beginningOfTest.plusYears(1);
persistResource(
Registry.get("tld")
@@ -691,7 +684,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testFailure_cursorAfterExecutionTime() {
+void testFailure_cursorAfterExecutionTime() {
action.cursorTimeParam = Optional.of(clock.nowUtc().plusYears(1));
IllegalArgumentException thrown =
assertThrows(IllegalArgumentException.class, this::runMapreduce);
@@ -701,7 +694,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testFailure_cursorAtExecutionTime() {
+void testFailure_cursorAtExecutionTime() {
// The clock advances one milli on runMapreduce.
action.cursorTimeParam = Optional.of(clock.nowUtc().plusMillis(1));
IllegalArgumentException thrown =
@@ -712,7 +705,7 @@ public class ExpandRecurringBillingEventsActionTest
}
@Test
-public void testFailure_mapperException_doesNotMoveCursor() throws Exception {
+void testFailure_mapperException_doesNotMoveCursor() throws Exception {
saveCursor(START_OF_TIME); // Need a saved cursor to verify that it didn't move.
// Set target to a TLD that doesn't exist.
recurring = persistResource(recurring.asBuilder().setTargetId("domain.junk").build());


@@ -50,7 +50,7 @@ import google.registry.model.server.Lock;
import google.registry.testing.FakeClock;
import google.registry.testing.FakeResponse;
import google.registry.testing.FakeSleeper;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
import google.registry.testing.TaskQueueHelper.TaskMatcher;
import google.registry.testing.mapreduce.MapreduceTestCase;
import google.registry.util.AppEngineServiceUtils;
@@ -61,27 +61,24 @@ import google.registry.util.SystemSleeper;
import java.util.Optional;
import org.joda.time.DateTime;
import org.joda.time.Duration;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
import org.mockito.Mock;
/** Unit tests for {@link RefreshDnsOnHostRenameAction}. */
-@RunWith(JUnit4.class)
public class RefreshDnsOnHostRenameActionTest
extends MapreduceTestCase<RefreshDnsOnHostRenameAction> {
-@Rule public final InjectRule inject = new InjectRule();
+@RegisterExtension public final InjectExtension inject = new InjectExtension();
private AsyncTaskEnqueuer enqueuer;
private final FakeClock clock = new FakeClock(DateTime.parse("2015-01-15T11:22:33Z"));
private final FakeResponse fakeResponse = new FakeResponse();
@Mock private RequestStatusChecker requestStatusChecker;
-@Before
-public void setup() {
+@BeforeEach
+void beforeEach() {
createTld("tld");
enqueuer =
AsyncTaskEnqueuerTest.createForTesting(
@@ -124,7 +121,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void testSuccess_dnsUpdateEnqueued() throws Exception {
+void testSuccess_dnsUpdateEnqueued() throws Exception {
HostResource host = persistActiveHost("ns1.example.tld");
persistResource(newDomainBase("example.tld", host));
persistResource(newDomainBase("otherexample.tld", host));
@@ -141,7 +138,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void testSuccess_multipleHostsProcessedInBatch() throws Exception {
+void testSuccess_multipleHostsProcessedInBatch() throws Exception {
HostResource host1 = persistActiveHost("ns1.example.tld");
HostResource host2 = persistActiveHost("ns2.example.tld");
HostResource host3 = persistActiveHost("ns3.example.tld");
@@ -165,7 +162,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void testSuccess_deletedHost_doesntTriggerDnsRefresh() throws Exception {
+void testSuccess_deletedHost_doesntTriggerDnsRefresh() throws Exception {
HostResource host = persistDeletedHost("ns11.fakesss.tld", clock.nowUtc().minusDays(4));
persistResource(newDomainBase("example1.tld", host));
DateTime timeEnqueued = clock.nowUtc();
@@ -180,7 +177,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void testSuccess_noDnsTasksForDeletedDomain() throws Exception {
+void testSuccess_noDnsTasksForDeletedDomain() throws Exception {
HostResource renamedHost = persistActiveHost("ns1.example.tld");
persistResource(
newDomainBase("example.tld", renamedHost)
@@ -194,7 +191,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void testRun_hostDoesntExist_delaysTask() throws Exception {
+void testRun_hostDoesntExist_delaysTask() throws Exception {
HostResource host = newHostResource("ns1.example.tld");
enqueuer.enqueueAsyncDnsRefresh(host, clock.nowUtc());
enqueueMapreduceOnly();
@@ -208,7 +205,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void test_cannotAcquireLock() {
+void test_cannotAcquireLock() {
// Make lock acquisition fail.
acquireLock();
enqueueMapreduceOnly();
@@ -217,7 +214,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void test_mapreduceHasWorkToDo_lockIsAcquired() {
+void test_mapreduceHasWorkToDo_lockIsAcquired() {
HostResource host = persistActiveHost("ns1.example.tld");
enqueuer.enqueueAsyncDnsRefresh(host, clock.nowUtc());
enqueueMapreduceOnly();
@@ -225,7 +222,7 @@ public class RefreshDnsOnHostRenameActionTest
}
@Test
-public void test_noTasksToLease_releasesLockImmediately() throws Exception {
+void test_noTasksToLease_releasesLockImmediately() throws Exception {
enqueueMapreduceOnly();
assertNoDnsTasksEnqueued();
assertNoTasksEnqueued(QUEUE_ASYNC_HOST_RENAME);


@@ -34,7 +34,7 @@ import com.google.common.collect.ImmutableSet;
import google.registry.model.domain.DomainBase;
import google.registry.model.host.HostResource;
import google.registry.schema.domain.RegistryLock;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
import google.registry.testing.DeterministicStringGenerator;
import google.registry.testing.FakeClock;
import google.registry.testing.FakeResponse;
@@ -44,14 +44,11 @@ import google.registry.util.AppEngineServiceUtils;
import google.registry.util.StringGenerator.Alphabets;
import java.util.Optional;
import org.joda.time.Duration;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
/** Unit tests for {@link RelockDomainAction}. */
-@RunWith(JUnit4.class)
public class RelockDomainActionTest {
private static final String DOMAIN_NAME = "example.tld";
@@ -67,9 +64,9 @@ public class RelockDomainActionTest {
AsyncTaskEnqueuerTest.createForTesting(
mock(AppEngineServiceUtils.class), clock, Duration.ZERO));
-@Rule
-public final AppEngineRule appEngineRule =
-    AppEngineRule.builder()
+@RegisterExtension
+public final AppEngineExtension appEngineRule =
+    AppEngineExtension.builder()
.withDatastoreAndCloudSql()
.withUserService(UserInfo.create(POC_ID, "12345"))
.build();
@@ -78,8 +75,8 @@ public class RelockDomainActionTest {
private RegistryLock oldLock;
private RelockDomainAction action;
-@Before
-public void setup() {
+@BeforeEach
+void beforeEach() {
createTlds("tld", "net");
HostResource host = persistActiveHost("ns1.example.net");
domain = persistResource(newDomainBase(DOMAIN_NAME, host));
@@ -95,7 +92,7 @@ public class RelockDomainActionTest {
}
@Test
public void testLock() {
void testLock() {
action.run();
assertThat(reloadDomain(domain).getStatusValues())
.containsAtLeastElementsIn(REGISTRY_LOCK_STATUSES);
@@ -107,7 +104,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_unknownCode() {
void testFailure_unknownCode() {
action = createAction(12128675309L);
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -115,7 +112,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_pendingDelete() {
void testFailure_pendingDelete() {
persistResource(domain.asBuilder().setStatusValues(ImmutableSet.of(PENDING_DELETE)).build());
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -124,7 +121,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_pendingTransfer() {
void testFailure_pendingTransfer() {
persistResource(domain.asBuilder().setStatusValues(ImmutableSet.of(PENDING_TRANSFER)).build());
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -133,7 +130,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_domainAlreadyLocked() {
void testFailure_domainAlreadyLocked() {
domainLockUtils.administrativelyApplyLock(DOMAIN_NAME, CLIENT_ID, null, true);
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -142,7 +139,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_domainDeleted() {
void testFailure_domainDeleted() {
persistDomainAsDeleted(domain, clock.nowUtc());
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -151,7 +148,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_domainTransferred() {
void testFailure_domainTransferred() {
persistResource(domain.asBuilder().setPersistedCurrentSponsorClientId("NewRegistrar").build());
action.run();
assertThat(response.getStatus()).isEqualTo(SC_NO_CONTENT);
@@ -164,7 +161,7 @@ public class RelockDomainActionTest {
}
@Test
public void testFailure_relockAlreadySet() {
void testFailure_relockAlreadySet() {
RegistryLock newLock =
domainLockUtils.administrativelyApplyLock(DOMAIN_NAME, CLIENT_ID, null, true);
saveRegistryLock(oldLock.asBuilder().setRelock(newLock).build());

View File

@@ -25,18 +25,14 @@ import google.registry.model.transfer.TransferStatus;
import google.registry.testing.FakeResponse;
import google.registry.testing.mapreduce.MapreduceTestCase;
import org.joda.time.DateTime;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
/** Unit tests for {@link ResaveAllEppResourcesAction}. */
@RunWith(JUnit4.class)
public class ResaveAllEppResourcesActionTest
extends MapreduceTestCase<ResaveAllEppResourcesAction> {
class ResaveAllEppResourcesActionTest extends MapreduceTestCase<ResaveAllEppResourcesAction> {
@Before
public void init() {
@BeforeEach
void beforeEach() {
action = new ResaveAllEppResourcesAction();
action.mrRunner = makeDefaultRunner();
action.response = new FakeResponse();
@@ -48,7 +44,7 @@ public class ResaveAllEppResourcesActionTest
}
@Test
public void test_mapreduceSuccessfullyResavesEntity() throws Exception {
void test_mapreduceSuccessfullyResavesEntity() throws Exception {
ContactResource contact = persistActiveContact("test123");
DateTime creationTime = contact.getUpdateTimestamp().getTimestamp();
assertThat(ofy().load().entity(contact).now().getUpdateTimestamp().getTimestamp())
@@ -60,7 +56,7 @@ public class ResaveAllEppResourcesActionTest
}
@Test
public void test_mapreduceResolvesPendingTransfer() throws Exception {
void test_mapreduceResolvesPendingTransfer() throws Exception {
DateTime now = DateTime.now(UTC);
// Set up a contact with a transfer that implicitly completed five days ago.
ContactResource contact =

View File

@@ -42,40 +42,39 @@ import google.registry.model.domain.rgp.GracePeriodStatus;
import google.registry.model.eppcommon.StatusValue;
import google.registry.model.ofy.Ofy;
import google.registry.request.Response;
import google.registry.testing.AppEngineRule;
import google.registry.testing.AppEngineExtension;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectRule;
import google.registry.testing.InjectExtension;
import google.registry.testing.TaskQueueHelper.TaskMatcher;
import google.registry.util.AppEngineServiceUtils;
import org.joda.time.DateTime;
import org.joda.time.Duration;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnit;
import org.mockito.junit.MockitoRule;
import org.mockito.junit.jupiter.MockitoExtension;
import org.mockito.junit.jupiter.MockitoSettings;
import org.mockito.quality.Strictness;
/** Unit tests for {@link ResaveEntityAction}. */
@RunWith(JUnit4.class)
@ExtendWith(MockitoExtension.class)
public class ResaveEntityActionTest {
@Rule
public final AppEngineRule appEngine =
AppEngineRule.builder().withDatastoreAndCloudSql().withTaskQueue().build();
@RegisterExtension
public final AppEngineExtension appEngine =
AppEngineExtension.builder().withDatastoreAndCloudSql().withTaskQueue().build();
@Rule public final InjectRule inject = new InjectRule();
@Rule public final MockitoRule mocks = MockitoJUnit.rule();
@RegisterExtension public final InjectExtension inject = new InjectExtension();
@Mock private AppEngineServiceUtils appEngineServiceUtils;
@Mock private Response response;
private final FakeClock clock = new FakeClock(DateTime.parse("2016-02-11T10:00:00Z"));
private AsyncTaskEnqueuer asyncTaskEnqueuer;
@Before
public void before() {
@BeforeEach
void beforeEach() {
inject.setStaticField(Ofy.class, "clock", clock);
when(appEngineServiceUtils.getServiceHostname("backend")).thenReturn("backend.hostname.fake");
asyncTaskEnqueuer =
@@ -93,8 +92,9 @@ public class ResaveEntityActionTest {
action.run();
}
@MockitoSettings(strictness = Strictness.LENIENT)
@Test
public void test_domainPendingTransfer_isResavedAndTransferCompleted() {
void test_domainPendingTransfer_isResavedAndTransferCompleted() {
DomainBase domain =
persistDomainWithPendingTransfer(
persistDomainWithDependentResources(
@@ -116,7 +116,7 @@ public class ResaveEntityActionTest {
}
@Test
public void test_domainPendingDeletion_isResavedAndReenqueued() {
void test_domainPendingDeletion_isResavedAndReenqueued() {
DomainBase domain =
persistResource(
newDomainBase("domain.tld")

View File

@@ -22,14 +22,11 @@ import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
/** Unit tests for {@link BeamUtils}. */
@RunWith(JUnit4.class)
public class BeamUtilsTest {
class BeamUtilsTest {
private static final String GENERIC_SCHEMA =
"{\"name\": \"AnObject\", "
@@ -41,8 +38,8 @@ public class BeamUtilsTest {
private SchemaAndRecord schemaAndRecord;
@Before
public void initializeRecord() {
@BeforeEach
void beforeEach() {
// Create a record with a given JSON schema.
GenericRecord record = new GenericData.Record(new Schema.Parser().parse(GENERIC_SCHEMA));
record.put("aString", "hello world");
@@ -51,26 +48,26 @@ public class BeamUtilsTest {
}
@Test
public void testExtractField_fieldExists_returnsExpectedStringValues() {
void testExtractField_fieldExists_returnsExpectedStringValues() {
assertThat(BeamUtils.extractField(schemaAndRecord.getRecord(), "aString"))
.isEqualTo("hello world");
assertThat(BeamUtils.extractField(schemaAndRecord.getRecord(), "aFloat")).isEqualTo("2.54");
}
@Test
public void testExtractField_fieldDoesntExist_returnsNull() {
void testExtractField_fieldDoesntExist_returnsNull() {
schemaAndRecord.getRecord().put("aFloat", null);
assertThat(BeamUtils.extractField(schemaAndRecord.getRecord(), "aFloat")).isEqualTo("null");
assertThat(BeamUtils.extractField(schemaAndRecord.getRecord(), "missing")).isEqualTo("null");
}
@Test
public void testCheckFieldsNotNull_noExceptionIfAllPresent() {
void testCheckFieldsNotNull_noExceptionIfAllPresent() {
BeamUtils.checkFieldsNotNull(ImmutableList.of("aString", "aFloat"), schemaAndRecord);
}
@Test
public void testCheckFieldsNotNull_fieldMissing_throwsException() {
void testCheckFieldsNotNull_fieldMissing_throwsException() {
IllegalStateException expected =
assertThrows(
IllegalStateException.class,

View File

@@ -0,0 +1,542 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
// This applies to our modifications; the base file's license header is:
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package google.registry.beam;
import static com.google.common.base.Preconditions.checkState;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.base.Strings;
import com.google.common.collect.Maps;
import java.io.IOException;
import java.lang.annotation.Annotation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.UUID;
import java.util.function.Predicate;
import javax.annotation.Nullable;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.annotations.Internal;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.metrics.MetricNameFilter;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricsEnvironment;
import org.apache.beam.sdk.metrics.MetricsFilter;
import org.apache.beam.sdk.options.ApplicationNameOptions;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptions.CheckEnabled;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
import org.apache.beam.sdk.runners.TransformHierarchy;
import org.apache.beam.sdk.runners.TransformHierarchy.Node;
import org.apache.beam.sdk.testing.CrashingRunner;
import org.apache.beam.sdk.testing.NeedsRunner;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipelineOptions;
import org.apache.beam.sdk.testing.ValidatesRunner;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.apache.beam.sdk.util.common.ReflectHelpers;
import org.junit.experimental.categories.Category;
import org.junit.jupiter.api.extension.AfterEachCallback;
import org.junit.jupiter.api.extension.BeforeEachCallback;
import org.junit.jupiter.api.extension.ExtensionContext;
// NOTE: This file is copied from the Apache Beam distribution so that it can be locally modified to
// support JUnit 5.
/**
* A creator of test pipelines, for use inside tests, that can be configured to run locally or
* against a remote pipeline runner.
*
* <p>In order to run tests on a pipeline runner, the following conditions must be met:
*
* <ul>
* <li>System property "beamTestPipelineOptions" must contain a JSON delimited list of pipeline
* options. For example:
* <pre>{@code [
* "--runner=TestDataflowRunner",
* "--project=mygcpproject",
* "--stagingLocation=gs://mygcsbucket/path"
* ]}</pre>
* Note that the set of pipeline options required is pipeline runner specific.
* <li>Jars containing the SDK and test classes must be available on the classpath.
* </ul>
*
* <p>Use {@link PAssert} for tests, as it integrates with this test harness in both direct and
* remote execution modes. For example:
*
* <pre><code>
* {@literal @RegisterExtension}
* public final transient TestPipelineExtension pipeline = TestPipelineExtension.create();
*
* {@literal @Test}
* public void myPipelineTest() throws Exception {
* final PCollection&lt;String&gt; pCollection = pipeline.apply(...)
* PAssert.that(pCollection).containsInAnyOrder(...);
* pipeline.run();
* }
* </code></pre>
*
* <p>Pipeline runners are required to throw an {@link AssertionError} containing the message from
* the {@link PAssert} that failed.
*
* <p>See also the <a href="https://beam.apache.org/contribute/testing/">Testing</a> documentation
* section.
*/
public class TestPipelineExtension extends Pipeline
implements BeforeEachCallback, AfterEachCallback {
private final PipelineOptions options;
private static class PipelineRunEnforcement {
@SuppressWarnings("WeakerAccess")
protected boolean enableAutoRunIfMissing;
protected final Pipeline pipeline;
protected boolean runAttempted;
private PipelineRunEnforcement(final Pipeline pipeline) {
this.pipeline = pipeline;
}
protected void enableAutoRunIfMissing(final boolean enable) {
enableAutoRunIfMissing = enable;
}
protected void beforePipelineExecution() {
runAttempted = true;
}
protected void afterPipelineExecution() {}
protected void afterUserCodeFinished() {
if (!runAttempted && enableAutoRunIfMissing) {
pipeline.run().waitUntilFinish();
}
}
}
private static class PipelineAbandonedNodeEnforcement extends PipelineRunEnforcement {
// Null until the pipeline has been run
@Nullable private List<TransformHierarchy.Node> runVisitedNodes;
private final Predicate<Node> isPAssertNode =
node ->
node.getTransform() instanceof PAssert.GroupThenAssert
|| node.getTransform() instanceof PAssert.GroupThenAssertForSingleton
|| node.getTransform() instanceof PAssert.OneSideInputAssert;
private static class NodeRecorder extends PipelineVisitor.Defaults {
private final List<TransformHierarchy.Node> visited = new ArrayList<>();
@Override
public void leaveCompositeTransform(final TransformHierarchy.Node node) {
visited.add(node);
}
@Override
public void visitPrimitiveTransform(final TransformHierarchy.Node node) {
visited.add(node);
}
}
private PipelineAbandonedNodeEnforcement(final TestPipelineExtension pipeline) {
super(pipeline);
runVisitedNodes = null;
}
private List<TransformHierarchy.Node> recordPipelineNodes(final Pipeline pipeline) {
final NodeRecorder nodeRecorder = new NodeRecorder();
pipeline.traverseTopologically(nodeRecorder);
return nodeRecorder.visited;
}
private boolean isEmptyPipeline(final Pipeline pipeline) {
final IsEmptyVisitor isEmptyVisitor = new IsEmptyVisitor();
pipeline.traverseTopologically(isEmptyVisitor);
return isEmptyVisitor.isEmpty();
}
private void verifyPipelineExecution() {
if (!isEmptyPipeline(pipeline)) {
if (!runAttempted && !enableAutoRunIfMissing) {
throw new PipelineRunMissingException("The pipeline has not been run.");
} else {
final List<TransformHierarchy.Node> pipelineNodes = recordPipelineNodes(pipeline);
if (pipelineRunSucceeded() && !visitedAll(pipelineNodes)) {
final boolean hasDanglingPAssert =
pipelineNodes.stream()
.filter(pn -> !runVisitedNodes.contains(pn))
.anyMatch(isPAssertNode);
if (hasDanglingPAssert) {
throw new AbandonedNodeException("The pipeline contains abandoned PAssert(s).");
} else {
throw new AbandonedNodeException("The pipeline contains abandoned PTransform(s).");
}
}
}
}
}
private boolean visitedAll(final List<TransformHierarchy.Node> pipelineNodes) {
return runVisitedNodes.equals(pipelineNodes);
}
private boolean pipelineRunSucceeded() {
return runVisitedNodes != null;
}
@Override
protected void afterPipelineExecution() {
runVisitedNodes = recordPipelineNodes(pipeline);
super.afterPipelineExecution();
}
@Override
protected void afterUserCodeFinished() {
super.afterUserCodeFinished();
verifyPipelineExecution();
}
}
/**
* An exception thrown in case an abandoned {@link org.apache.beam.sdk.transforms.PTransform} is
* detected, that is, a {@link org.apache.beam.sdk.transforms.PTransform} that has not been run.
*/
public static class AbandonedNodeException extends RuntimeException {
AbandonedNodeException(final String msg) {
super(msg);
}
}
/** An exception thrown in case a test finishes without invoking {@link Pipeline#run()}. */
public static class PipelineRunMissingException extends RuntimeException {
PipelineRunMissingException(final String msg) {
super(msg);
}
}
/** System property used to set {@link TestPipelineOptions}. */
public static final String PROPERTY_BEAM_TEST_PIPELINE_OPTIONS = "beamTestPipelineOptions";
static final String PROPERTY_USE_DEFAULT_DUMMY_RUNNER = "beamUseDummyRunner";
private static final ObjectMapper MAPPER =
new ObjectMapper()
.registerModules(ObjectMapper.findModules(ReflectHelpers.findClassLoader()));
@SuppressWarnings("OptionalUsedAsFieldOrParameterType")
private Optional<? extends PipelineRunEnforcement> enforcement = Optional.empty();
/**
* Creates and returns a new test pipeline.
*
* <p>Use {@link PAssert} to add tests, then call {@link Pipeline#run} to execute the pipeline and
* check the tests.
*/
public static TestPipelineExtension create() {
return fromOptions(testingPipelineOptions());
}
public static TestPipelineExtension fromOptions(PipelineOptions options) {
return new TestPipelineExtension(options);
}
private TestPipelineExtension(final PipelineOptions options) {
super(options);
this.options = options;
}
@Override
public PipelineOptions getOptions() {
return this.options;
}
@Override
public void beforeEach(ExtensionContext context) throws Exception {
options.as(ApplicationNameOptions.class).setAppName(getAppName(context));
// If the enforcement level has not been set by the user, do auto-inference.
if (!enforcement.isPresent()) {
final boolean isCrashingRunner = CrashingRunner.class.isAssignableFrom(options.getRunner());
checkState(
!isCrashingRunner,
"Cannot test using a [%s] runner. Please re-check your configuration.",
CrashingRunner.class.getSimpleName());
enableAbandonedNodeEnforcement(true);
}
}
@Override
public void afterEach(ExtensionContext context) throws Exception {
enforcement.get().afterUserCodeFinished();
}
/** Returns the class + method name of the test. */
private String getAppName(ExtensionContext context) {
String methodName = context.getRequiredTestMethod().getName();
Class<?> testClass = context.getRequiredTestClass();
if (testClass.isMemberClass()) {
return String.format(
"%s$%s-%s",
testClass.getEnclosingClass().getSimpleName(), testClass.getSimpleName(), methodName);
} else {
return String.format("%s-%s", testClass.getSimpleName(), methodName);
}
}
/**
* Runs this {@link TestPipelineExtension}, unwrapping any {@code AssertionError} that is raised
* during testing.
*/
@Override
public PipelineResult run() {
return run(getOptions());
}
/** Like {@link #run} but with the given potentially modified options. */
@Override
public PipelineResult run(PipelineOptions options) {
checkState(
enforcement.isPresent(),
"Is your TestPipelineExtension declaration missing a @RegisterExtension annotation? Usage: "
+ "@RegisterExtension public final transient TestPipelineExtension pipeline = "
+ "TestPipelineExtension.create();");
final PipelineResult pipelineResult;
try {
enforcement.get().beforePipelineExecution();
PipelineOptions updatedOptions =
MAPPER.convertValue(MAPPER.valueToTree(options), PipelineOptions.class);
updatedOptions
.as(TestValueProviderOptions.class)
.setProviderRuntimeValues(StaticValueProvider.of(providerRuntimeValues));
pipelineResult = super.run(updatedOptions);
verifyPAssertsSucceeded(this, pipelineResult);
} catch (RuntimeException exc) {
Throwable cause = exc.getCause();
if (cause instanceof AssertionError) {
throw (AssertionError) cause;
} else {
throw exc;
}
}
// If we reach this point, the pipeline has been run and no exceptions have been thrown during
// its execution.
enforcement.get().afterPipelineExecution();
return pipelineResult;
}
/** Implementation detail of {@link #newProvider}, do not use. */
@Internal
public interface TestValueProviderOptions extends PipelineOptions {
ValueProvider<Map<String, Object>> getProviderRuntimeValues();
void setProviderRuntimeValues(ValueProvider<Map<String, Object>> runtimeValues);
}
/**
* Returns a new {@link ValueProvider} that is inaccessible before {@link #run}, but will be
* accessible while the pipeline runs.
*/
public <T> ValueProvider<T> newProvider(T runtimeValue) {
String uuid = UUID.randomUUID().toString();
providerRuntimeValues.put(uuid, runtimeValue);
return ValueProvider.NestedValueProvider.of(
options.as(TestValueProviderOptions.class).getProviderRuntimeValues(),
new GetFromRuntimeValues<T>(uuid));
}
private final Map<String, Object> providerRuntimeValues = Maps.newHashMap();
private static class GetFromRuntimeValues<T>
implements SerializableFunction<Map<String, Object>, T> {
private final String key;
private GetFromRuntimeValues(String key) {
this.key = key;
}
@Override
public T apply(Map<String, Object> input) {
return (T) input.get(key);
}
}
/**
* Enables the abandoned node detection. Abandoned nodes are <code>PTransforms</code>, <code>
* PAsserts</code> included, that were not executed by the pipeline runner. Abandoned nodes are
* most likely to occur due to one of the following scenarios:
*
* <ul>
* <li>Lack of a <code>pipeline.run()</code> statement at the end of a test.
* <li>Addition of PTransforms after the pipeline has already run.
* </ul>
*
* Abandoned node detection is automatically enabled when a real pipeline runner (i.e. not a
* {@link CrashingRunner}) and/or a {@link NeedsRunner} or a {@link ValidatesRunner} annotation
* are detected.
*/
public TestPipelineExtension enableAbandonedNodeEnforcement(final boolean enable) {
enforcement =
enable
? Optional.of(new PipelineAbandonedNodeEnforcement(this))
: Optional.of(new PipelineRunEnforcement(this));
return this;
}
/**
* If enabled, a <code>pipeline.run()</code> statement will be added automatically in case it is
* missing in the test.
*/
public TestPipelineExtension enableAutoRunIfMissing(final boolean enable) {
enforcement.get().enableAutoRunIfMissing(enable);
return this;
}
@Override
public String toString() {
return "TestPipeline#" + options.as(ApplicationNameOptions.class).getAppName();
}
/** Creates {@link PipelineOptions} for testing. */
public static PipelineOptions testingPipelineOptions() {
try {
@Nullable
String beamTestPipelineOptions = System.getProperty(PROPERTY_BEAM_TEST_PIPELINE_OPTIONS);
PipelineOptions options =
Strings.isNullOrEmpty(beamTestPipelineOptions)
? PipelineOptionsFactory.create()
: PipelineOptionsFactory.fromArgs(
MAPPER.readValue(beamTestPipelineOptions, String[].class))
.as(TestPipelineOptions.class);
// If no options were specified, set some reasonable defaults
if (Strings.isNullOrEmpty(beamTestPipelineOptions)) {
// If there are no provided options, check to see if a dummy runner should be used.
String useDefaultDummy = System.getProperty(PROPERTY_USE_DEFAULT_DUMMY_RUNNER);
if (!Strings.isNullOrEmpty(useDefaultDummy) && Boolean.valueOf(useDefaultDummy)) {
options.setRunner(CrashingRunner.class);
}
}
options.setStableUniqueNames(CheckEnabled.ERROR);
FileSystems.setDefaultPipelineOptions(options);
return options;
} catch (IOException e) {
throw new RuntimeException(
"Unable to instantiate test options from system property "
+ PROPERTY_BEAM_TEST_PIPELINE_OPTIONS
+ ":"
+ System.getProperty(PROPERTY_BEAM_TEST_PIPELINE_OPTIONS),
e);
}
}
/**
* Verifies that all {@link PAssert PAsserts} in the pipeline have been executed and were successful.
*
* <p>Note this only runs for runners which support Metrics. Runners which do not should verify
* this in some other way. See: https://issues.apache.org/jira/browse/BEAM-2001
*/
public static void verifyPAssertsSucceeded(Pipeline pipeline, PipelineResult pipelineResult) {
if (MetricsEnvironment.isMetricsSupported()) {
long expectedNumberOfAssertions = (long) PAssert.countAsserts(pipeline);
long successfulAssertions = 0;
Iterable<MetricResult<Long>> successCounterResults =
pipelineResult
.metrics()
.queryMetrics(
MetricsFilter.builder()
.addNameFilter(MetricNameFilter.named(PAssert.class, PAssert.SUCCESS_COUNTER))
.build())
.getCounters();
for (MetricResult<Long> counter : successCounterResults) {
if (counter.getAttempted() > 0) {
successfulAssertions++;
}
}
assertThat(
String.format(
"Expected %d successful assertions, but found %d.",
expectedNumberOfAssertions, successfulAssertions),
successfulAssertions,
is(expectedNumberOfAssertions));
}
}
private static class IsEmptyVisitor extends PipelineVisitor.Defaults {
private boolean empty = true;
public boolean isEmpty() {
return empty;
}
@Override
public void visitPrimitiveTransform(TransformHierarchy.Node node) {
empty = false;
}
}
/**
* A utility class for querying annotations.
*
* <p>NOTE: This was copied from the Apache Beam project from a separate file only for visibility
* reasons (it's package-private there).
*/
static class Annotations {
/** Annotation predicates. */
static class Predicates {
static Predicate<Annotation> isAnnotationOfType(final Class<? extends Annotation> clazz) {
return annotation ->
annotation.annotationType() != null && annotation.annotationType().equals(clazz);
}
static Predicate<Annotation> isCategoryOf(final Class<?> value, final boolean allowDerived) {
return category ->
Arrays.stream(((Category) category).value())
.anyMatch(
aClass -> allowDerived ? value.isAssignableFrom(aClass) : value.equals(aClass));
}
}
}
}

View File

@@ -26,7 +26,7 @@ import com.googlecode.objectify.Key;
import google.registry.backup.CommitLogExports;
import google.registry.backup.VersionedEntity;
import google.registry.model.ofy.CommitLogCheckpoint;
import google.registry.testing.AppEngineRule;
import google.registry.testing.AppEngineExtension;
import google.registry.testing.FakeClock;
import google.registry.tools.LevelDbFileBuilder;
import java.io.File;
@@ -53,7 +53,7 @@ class BackupTestStore implements AutoCloseable {
DateTimeFormat.forPattern("yyyy-MM-dd'T'HH:mm:ss_SSS");
private final FakeClock fakeClock;
private AppEngineRule appEngine;
private AppEngineExtension appEngine;
/** For fetching the persisted Datastore Entity directly. */
private DatastoreService datastoreService;
@@ -62,12 +62,12 @@ class BackupTestStore implements AutoCloseable {
BackupTestStore(FakeClock fakeClock) throws Exception {
this.fakeClock = fakeClock;
this.appEngine =
new AppEngineRule.Builder()
new AppEngineExtension.Builder()
.withDatastore()
.withoutCannedData()
.withClock(fakeClock)
.build();
this.appEngine.beforeEach(null);
this.appEngine.setUp();
datastoreService = DatastoreServiceFactory.getDatastoreService();
}
@@ -186,7 +186,7 @@ class BackupTestStore implements AutoCloseable {
@Override
public void close() throws Exception {
if (appEngine != null) {
appEngine.afterEach(null);
appEngine.tearDown();
appEngine = null;
}
}

View File

@@ -35,7 +35,7 @@ import google.registry.model.ofy.Ofy;
import google.registry.model.registry.Registry;
import google.registry.persistence.VKey;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectRule;
import google.registry.testing.InjectExtension;
import google.registry.tools.LevelDbLogReader;
import java.io.File;
import java.io.IOException;
@@ -58,7 +58,7 @@ public class BackupTestStoreTest {
@TempDir File tempDir;
@RegisterExtension InjectRule injectRule = new InjectRule();
@RegisterExtension InjectExtension injectRule = new InjectExtension();
private FakeClock fakeClock;
private BackupTestStore store;

View File

@@ -0,0 +1,72 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import java.io.File;
import java.io.IOException;
import java.io.PrintStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Supplier;
import org.junit.jupiter.api.extension.AfterEachCallback;
import org.junit.jupiter.api.extension.BeforeEachCallback;
import org.junit.jupiter.api.extension.ExtensionContext;
import org.testcontainers.containers.JdbcDatabaseContainer;
/**
* Helpers for setting up {@link BeamJpaModule} in tests.
*
* <p>This extension is often used with a database container and/or a temporary file folder. Users must
* make sure that all dependent extensions are set up before this extension, e.g., by assigning
* {@link org.junit.jupiter.api.Order orders}.
*/
public final class BeamJpaExtension implements BeforeEachCallback, AfterEachCallback, Serializable {
private final transient JdbcDatabaseContainer<?> database;
private final transient Supplier<Path> credentialPathSupplier;
private transient BeamJpaModule beamJpaModule;
private File credentialFile;
public BeamJpaExtension(Supplier<Path> credentialPathSupplier, JdbcDatabaseContainer database) {
this.database = database;
this.credentialPathSupplier = credentialPathSupplier;
}
public File getCredentialFile() {
return credentialFile;
}
public BeamJpaModule getBeamJpaModule() {
if (beamJpaModule != null) {
return beamJpaModule;
}
return beamJpaModule = new BeamJpaModule(credentialFile.getAbsolutePath(), null);
}
@Override
public void beforeEach(ExtensionContext context) throws IOException {
credentialFile = Files.createFile(credentialPathSupplier.get()).toFile();
new PrintStream(credentialFile)
.printf("%s %s %s", database.getJdbcUrl(), database.getUsername(), database.getPassword())
.close();
}
@Override
public void afterEach(ExtensionContext context) {
credentialFile.delete();
}
}

View File

@@ -19,12 +19,10 @@ import static com.google.common.truth.Truth.assertThat;
import google.registry.persistence.NomulusPostgreSql;
import google.registry.persistence.transaction.JpaTransactionManager;
import google.registry.testing.DatastoreEntityExtension;
import java.io.File;
import java.io.IOException;
import java.io.PrintStream;
import java.nio.file.Path;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledIfSystemProperty;
import org.junit.jupiter.api.extension.RegisterExtension;
@@ -35,31 +33,28 @@ import org.testcontainers.junit.jupiter.Testcontainers;
/** Unit tests for {@link BeamJpaModule}. */
@Testcontainers
-public class BeamJpaModuleTest {
-@Container
-public PostgreSQLContainer database = new PostgreSQLContainer(NomulusPostgreSql.getDockerTag());
+class BeamJpaModuleTest {
@RegisterExtension
-public DatastoreEntityExtension datastoreEntityExtension = new DatastoreEntityExtension();
+final DatastoreEntityExtension datastoreEntityExtension = new DatastoreEntityExtension();
-@TempDir File tempFolder;
+@Container
+final PostgreSQLContainer database = new PostgreSQLContainer(NomulusPostgreSql.getDockerTag());
-private File credentialFile;
+@SuppressWarnings("WeakerAccess")
+@TempDir
+Path tmpDir;
-@BeforeEach
-public void beforeEach() throws IOException {
-credentialFile = new File(tempFolder, "credential");
-new PrintStream(credentialFile)
-.printf("%s %s %s", database.getJdbcUrl(), database.getUsername(), database.getPassword())
-.close();
-}
+@RegisterExtension
+@Order(Order.DEFAULT + 1)
+final BeamJpaExtension beamJpaExtension =
+new BeamJpaExtension(() -> tmpDir.resolve("credential.dat"), database);
@Test
void getJpaTransactionManager_local() {
JpaTransactionManager jpa =
DaggerBeamJpaModule_JpaTransactionManagerComponent.builder()
-.beamJpaModule(new BeamJpaModule(credentialFile.getAbsolutePath()))
+.beamJpaModule(beamJpaExtension.getBeamJpaModule())
.build()
.localDbJpaTransactionManager();
assertThat(
@@ -80,14 +75,15 @@ public class BeamJpaModuleTest {
*/
@Test
@EnabledIfSystemProperty(named = "test.gcp_integration.env", matches = "\\S+")
-public void getJpaTransactionManager_cloudSql_authRequired() {
+void getJpaTransactionManager_cloudSql_authRequired() {
String environmentName = System.getProperty("test.gcp_integration.env");
FileSystems.setDefaultPipelineOptions(PipelineOptionsFactory.create());
JpaTransactionManager jpa =
DaggerBeamJpaModule_JpaTransactionManagerComponent.builder()
.beamJpaModule(
new BeamJpaModule(
-BackupPaths.getCloudSQLCredentialFilePatterns(environmentName).get(0)))
+BackupPaths.getCloudSQLCredentialFilePatterns(environmentName).get(0),
+String.format("domain-registry-%s", environmentName)))
.build()
.cloudSqlJpaTransactionManager();
assertThat(

@@ -21,49 +21,47 @@ import static google.registry.testing.DatastoreHelper.newRegistry;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import google.registry.backup.VersionedEntity;
+import google.registry.beam.TestPipelineExtension;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DomainBase;
import google.registry.model.ofy.Ofy;
import google.registry.model.registry.Registry;
import google.registry.testing.FakeClock;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
import java.io.File;
import java.io.IOException;
import java.io.Serializable;
+import java.nio.file.Files;
+import java.nio.file.Path;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.fs.MatchResult.Metadata;
-import org.apache.beam.sdk.testing.NeedsRunner;
import org.apache.beam.sdk.testing.PAssert;
-import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.DateTime;
-import org.junit.After;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.experimental.categories.Category;
-import org.junit.rules.TemporaryFolder;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.api.io.TempDir;
/** Unit tests for {@link Transforms} related to loading CommitLogs. */
-// TODO(weiminyu): Upgrade to JUnit5 when TestPipeline is upgraded. It is also easy to adapt with
-// a wrapper.
-@RunWith(JUnit4.class)
-public class CommitLogTransformsTest implements Serializable {
+class CommitLogTransformsTest implements Serializable {
private static final DateTime START_TIME = DateTime.parse("2000-01-01T00:00:00.0Z");
-@Rule public final transient TemporaryFolder temporaryFolder = new TemporaryFolder();
+@SuppressWarnings("WeakerAccess")
+@TempDir
+transient Path tmpDir;
-@Rule public final transient InjectRule injectRule = new InjectRule();
+@RegisterExtension final transient InjectExtension injectRule = new InjectExtension();
-@Rule
-public final transient TestPipeline pipeline =
-TestPipeline.create().enableAbandonedNodeEnforcement(true);
+@RegisterExtension
+final transient TestPipelineExtension testPipeline =
+TestPipelineExtension.create().enableAbandonedNodeEnforcement(true);
private FakeClock fakeClock;
private transient BackupTestStore store;
@@ -75,8 +73,8 @@ public class CommitLogTransformsTest implements Serializable {
private transient ContactResource contact;
private transient DomainBase domain;
-@Before
-public void beforeEach() throws Exception {
+@BeforeEach
+void beforeEach() throws Exception {
fakeClock = new FakeClock(START_TIME);
store = new BackupTestStore(fakeClock);
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
@@ -92,12 +90,12 @@ public class CommitLogTransformsTest implements Serializable {
contact = (ContactResource) store.loadAsOfyEntity(contact);
domain = (DomainBase) store.loadAsOfyEntity(domain);
-commitLogsDir = temporaryFolder.newFolder();
+commitLogsDir = Files.createDirectory(tmpDir.resolve("commit_logs")).toFile();
firstCommitLogFile = store.saveCommitLogs(commitLogsDir.getAbsolutePath());
}
-@After
-public void afterEach() throws Exception {
+@AfterEach
+void afterEach() throws Exception {
if (store != null) {
store.close();
store = null;
@@ -105,10 +103,9 @@ public class CommitLogTransformsTest implements Serializable {
}
@Test
-@Category(NeedsRunner.class)
-public void getCommitLogFilePatterns() {
+void getCommitLogFilePatterns() {
PCollection<String> patterns =
-pipeline.apply(
+testPipeline.apply(
"Get CommitLog file patterns",
Transforms.getCommitLogFilePatterns(commitLogsDir.getAbsolutePath()));
@@ -117,14 +114,13 @@ public class CommitLogTransformsTest implements Serializable {
PAssert.that(patterns).containsInAnyOrder(expectedPatterns);
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void getFilesByPatterns() {
+void getFilesByPatterns() {
PCollection<Metadata> fileMetas =
-pipeline
+testPipeline
.apply(
"File patterns to metadata",
Create.of(commitLogsDir.getAbsolutePath() + "/commit_diff_until_*")
@@ -149,12 +145,11 @@ public class CommitLogTransformsTest implements Serializable {
PAssert.that(fileNames).containsInAnyOrder(expectedFilenames);
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void filterCommitLogsByTime() throws IOException {
+void filterCommitLogsByTime() throws IOException {
ImmutableList<String> commitLogFilenames =
ImmutableList.of(
"commit_diff_until_2000-01-01T00:00:00.000Z",
@@ -163,16 +158,15 @@ public class CommitLogTransformsTest implements Serializable {
"commit_diff_until_2000-01-01T00:00:00.003Z",
"commit_diff_until_2000-01-01T00:00:00.004Z");
-File commitLogDir = temporaryFolder.newFolder();
for (String name : commitLogFilenames) {
-new File(commitLogDir, name).createNewFile();
+new File(commitLogsDir, name).createNewFile();
}
PCollection<String> filteredFilenames =
-pipeline
+testPipeline
.apply(
"Get commitlog file patterns",
-Transforms.getCommitLogFilePatterns(commitLogDir.getAbsolutePath()))
+Transforms.getCommitLogFilePatterns(commitLogsDir.getAbsolutePath()))
.apply("Find commitlog files", Transforms.getFilesByPatterns())
.apply(
"Filtered by Time",
@@ -194,14 +188,13 @@ public class CommitLogTransformsTest implements Serializable {
"commit_diff_until_2000-01-01T00:00:00.001Z",
"commit_diff_until_2000-01-01T00:00:00.002Z");
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void loadOneCommitLogFile() {
+void loadOneCommitLogFile() {
PCollection<VersionedEntity> entities =
-pipeline
+testPipeline
.apply(
"Get CommitLog file patterns",
Transforms.getCommitLogFilePatterns(commitLogsDir.getAbsolutePath()))
@@ -216,14 +209,13 @@ public class CommitLogTransformsTest implements Serializable {
KV.of(fakeClock.nowUtc().getMillis() - 1, store.loadAsDatastoreEntity(contact)),
KV.of(fakeClock.nowUtc().getMillis() - 1, store.loadAsDatastoreEntity(domain)));
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void loadOneCommitLogFile_filterByKind() {
+void loadOneCommitLogFile_filterByKind() {
PCollection<VersionedEntity> entities =
-pipeline
+testPipeline
.apply(
"Get CommitLog file patterns",
Transforms.getCommitLogFilePatterns(commitLogsDir.getAbsolutePath()))
@@ -236,6 +228,6 @@ public class CommitLogTransformsTest implements Serializable {
KV.of(fakeClock.nowUtc().getMillis() - 2, store.loadAsDatastoreEntity(registry)),
KV.of(fakeClock.nowUtc().getMillis() - 1, store.loadAsDatastoreEntity(contact)));
-pipeline.run();
+testPipeline.run();
}
}
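
The filterCommitLogsByTime test above keys off the ISO-8601 timestamp embedded in each `commit_diff_until_<time>` filename, keeping files that fall in a time window. A self-contained sketch of that filtering step — the half-open `[fromTime, toTime)` semantics are an assumption here, and the real logic lives in `Transforms`, not in this hypothetical `CommitLogTimeFilter` class:

```java
import java.time.Instant;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the commit-log time filter: keep files whose embedded timestamp
// falls in [fromTime, toTime). Illustrative only, not the real implementation.
public class CommitLogTimeFilter {
  private static final String PREFIX = "commit_diff_until_";

  static List<String> filter(List<String> fileNames, Instant fromTime, Instant toTime) {
    return fileNames.stream()
        .filter(
            name -> {
              // The timestamp is everything after the fixed prefix.
              Instant t = Instant.parse(name.substring(PREFIX.length()));
              return !t.isBefore(fromTime) && t.isBefore(toTime);
            })
        .collect(Collectors.toList());
  }

  public static void main(String[] args) {
    List<String> names =
        List.of(
            "commit_diff_until_2000-01-01T00:00:00.000Z",
            "commit_diff_until_2000-01-01T00:00:00.001Z",
            "commit_diff_until_2000-01-01T00:00:00.002Z");
    // Only the .001Z file is inside [.001Z, .002Z).
    System.out.println(
        filter(
            names,
            Instant.parse("2000-01-01T00:00:00.001Z"),
            Instant.parse("2000-01-01T00:00:00.002Z")));
  }
}
```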

@@ -0,0 +1,221 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import static google.registry.model.ImmutableObjectSubject.assertAboutImmutableObjects;
import static google.registry.model.ofy.ObjectifyService.ofy;
import static google.registry.persistence.transaction.TransactionManagerFactory.tm;
import static google.registry.testing.DatastoreHelper.cloneAndSetAutoTimestamps;
import static google.registry.testing.DatastoreHelper.createTld;
import static google.registry.testing.DatastoreHelper.persistResource;
import static google.registry.util.DateTimeUtils.START_OF_TIME;
import static org.junit.Assert.assertThrows;
import com.google.appengine.api.datastore.Entity;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.model.billing.BillingEvent;
import google.registry.model.billing.BillingEvent.OneTime;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DesignatedContact;
import google.registry.model.domain.DomainAuthInfo;
import google.registry.model.domain.DomainBase;
import google.registry.model.domain.GracePeriod;
import google.registry.model.domain.launch.LaunchNotice;
import google.registry.model.domain.rgp.GracePeriodStatus;
import google.registry.model.domain.secdns.DelegationSignerData;
import google.registry.model.eppcommon.AuthInfo.PasswordAuth;
import google.registry.model.eppcommon.StatusValue;
import google.registry.model.eppcommon.Trid;
import google.registry.model.host.HostResource;
import google.registry.model.ofy.Ofy;
import google.registry.model.poll.PollMessage;
import google.registry.model.reporting.HistoryEntry;
import google.registry.model.transfer.DomainTransferData;
import google.registry.model.transfer.TransferStatus;
import google.registry.persistence.VKey;
import google.registry.testing.AppEngineExtension;
import google.registry.testing.DatastoreHelper;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectExtension;
import org.joda.time.Instant;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
/** Unit tests for {@link DomainBaseUtil}. */
public class DomainBaseUtilTest {
private final FakeClock fakeClock = new FakeClock(Instant.now());
private DomainBase domain;
private Entity domainEntity;
private Key<OneTime> oneTimeBillKey;
private VKey<BillingEvent.Recurring> recurringBillKey;
private Key<DomainBase> domainKey;
@RegisterExtension
AppEngineExtension appEngineRule =
AppEngineExtension.builder().withDatastore().withClock(fakeClock).build();
@RegisterExtension InjectExtension injectRule = new InjectExtension();
@BeforeEach
void beforeEach() {
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
createTld("com");
domainKey = Key.create(null, DomainBase.class, "4-COM");
VKey<HostResource> hostKey =
persistResource(
new HostResource.Builder()
.setHostName("ns1.example.com")
.setSuperordinateDomain(VKey.from(domainKey))
.setRepoId("1-COM")
.build())
.createVKey();
VKey<ContactResource> contact1Key =
persistResource(
new ContactResource.Builder()
.setContactId("contact_id1")
.setRepoId("2-COM")
.build())
.createVKey();
VKey<ContactResource> contact2Key =
persistResource(
new ContactResource.Builder()
.setContactId("contact_id2")
.setRepoId("3-COM")
.build())
.createVKey();
Key<HistoryEntry> historyEntryKey =
Key.create(persistResource(new HistoryEntry.Builder().setParent(domainKey).build()));
oneTimeBillKey = Key.create(historyEntryKey, BillingEvent.OneTime.class, 1);
recurringBillKey = VKey.from(Key.create(historyEntryKey, BillingEvent.Recurring.class, 2));
VKey<PollMessage.Autorenew> autorenewPollKey =
VKey.from(Key.create(historyEntryKey, PollMessage.Autorenew.class, 3));
VKey<PollMessage.OneTime> onetimePollKey =
VKey.from(Key.create(historyEntryKey, PollMessage.OneTime.class, 1));
// Set up a new persisted domain entity.
domain =
persistResource(
cloneAndSetAutoTimestamps(
new DomainBase.Builder()
.setDomainName("example.com")
.setRepoId("4-COM")
.setCreationClientId("a registrar")
.setLastEppUpdateTime(fakeClock.nowUtc())
.setLastEppUpdateClientId("AnotherRegistrar")
.setLastTransferTime(fakeClock.nowUtc())
.setStatusValues(
ImmutableSet.of(
StatusValue.CLIENT_DELETE_PROHIBITED,
StatusValue.SERVER_DELETE_PROHIBITED,
StatusValue.SERVER_TRANSFER_PROHIBITED,
StatusValue.SERVER_UPDATE_PROHIBITED,
StatusValue.SERVER_RENEW_PROHIBITED,
StatusValue.SERVER_HOLD))
.setRegistrant(contact1Key)
.setContacts(
ImmutableSet.of(
DesignatedContact.create(DesignatedContact.Type.ADMIN, contact2Key)))
.setNameservers(ImmutableSet.of(hostKey))
.setSubordinateHosts(ImmutableSet.of("ns1.example.com"))
.setPersistedCurrentSponsorClientId("losing")
.setRegistrationExpirationTime(fakeClock.nowUtc().plusYears(1))
.setAuthInfo(DomainAuthInfo.create(PasswordAuth.create("password")))
.setDsData(
ImmutableSet.of(DelegationSignerData.create(1, 2, 3, new byte[] {0, 1, 2})))
.setLaunchNotice(
LaunchNotice.create("tcnid", "validatorId", START_OF_TIME, START_OF_TIME))
.setTransferData(
new DomainTransferData.Builder()
.setGainingClientId("gaining")
.setLosingClientId("losing")
.setPendingTransferExpirationTime(fakeClock.nowUtc())
.setServerApproveEntities(
ImmutableSet.of(
VKey.from(oneTimeBillKey), recurringBillKey, autorenewPollKey))
.setServerApproveBillingEvent(VKey.from(oneTimeBillKey))
.setServerApproveAutorenewEvent(recurringBillKey)
.setServerApproveAutorenewPollMessage(autorenewPollKey)
.setTransferRequestTime(fakeClock.nowUtc().plusDays(1))
.setTransferStatus(TransferStatus.SERVER_APPROVED)
.setTransferRequestTrid(Trid.create("client-trid", "server-trid"))
.build())
.setDeletePollMessage(onetimePollKey)
.setAutorenewBillingEvent(recurringBillKey)
.setAutorenewPollMessage(autorenewPollKey)
.setSmdId("smdid")
.addGracePeriod(
GracePeriod.create(
GracePeriodStatus.ADD,
fakeClock.nowUtc().plusDays(1),
"registrar",
null))
.build()));
domainEntity = tm().transact(() -> ofy().toEntity(domain));
}
@Test
void removeBillingAndPollAndHosts_allFkeysPresent() {
DomainBase domainTransformedByOfy =
domain
.asBuilder()
.setAutorenewBillingEvent(null)
.setAutorenewPollMessage(null)
.setNameservers(ImmutableSet.of())
.setDeletePollMessage(null)
.setTransferData(null)
.build();
DomainBase domainTransformedByUtil =
(DomainBase) ofy().toPojo(DomainBaseUtil.removeBillingAndPollAndHosts(domainEntity));
// Compensates for the missing INACTIVE status.
domainTransformedByUtil = domainTransformedByUtil.asBuilder().build();
assertAboutImmutableObjects()
.that(domainTransformedByUtil)
.isEqualExceptFields(domainTransformedByOfy, "revisions");
}
@Test
void removeBillingAndPollAndHosts_noFkeysPresent() {
DomainBase domainWithoutFKeys =
domain
.asBuilder()
.setAutorenewBillingEvent(null)
.setAutorenewPollMessage(null)
.setNameservers(ImmutableSet.of())
.setDeletePollMessage(null)
.setTransferData(null)
.build();
Entity entityWithoutFkeys = tm().transact(() -> ofy().toEntity(domainWithoutFKeys));
DomainBase domainTransformedByUtil =
(DomainBase) ofy().toPojo(DomainBaseUtil.removeBillingAndPollAndHosts(entityWithoutFkeys));
// Compensates for the missing INACTIVE status.
domainTransformedByUtil = domainTransformedByUtil.asBuilder().build();
assertAboutImmutableObjects()
.that(domainTransformedByUtil)
.isEqualExceptFields(domainWithoutFKeys, "revisions");
}
@Test
void removeBillingAndPollAndHosts_notDomainBase() {
Entity contactEntity =
tm().transact(() -> ofy().toEntity(DatastoreHelper.newContactResource("contact")));
assertThrows(
IllegalArgumentException.class,
() -> DomainBaseUtil.removeBillingAndPollAndHosts(contactEntity));
}
}

@@ -22,44 +22,39 @@ import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.backup.VersionedEntity;
+import google.registry.beam.TestPipelineExtension;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DomainBase;
import google.registry.model.ofy.Ofy;
import google.registry.model.registry.Registry;
import google.registry.testing.FakeClock;
-import google.registry.testing.InjectRule;
+import google.registry.testing.InjectExtension;
import java.io.File;
import java.io.Serializable;
+import java.nio.file.Path;
import java.util.Collections;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.fs.MatchResult.Metadata;
-import org.apache.beam.sdk.testing.NeedsRunner;
import org.apache.beam.sdk.testing.PAssert;
-import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.DateTime;
-import org.junit.After;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.experimental.categories.Category;
-import org.junit.rules.TemporaryFolder;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.api.io.TempDir;
/**
* Unit tests for {@link Transforms} related to loading Datastore exports.
*
* <p>This class implements {@link Serializable} so that test {@link DoFn} classes may be inlined.
*/
-// TODO(weiminyu): Upgrade to JUnit5 when TestPipeline is upgraded. It is also easy to adapt with
-// a wrapper.
-@RunWith(JUnit4.class)
-public class ExportloadingTransformsTest implements Serializable {
+class ExportloadingTransformsTest implements Serializable {
private static final DateTime START_TIME = DateTime.parse("2000-01-01T00:00:00.0Z");
private static final ImmutableList<Class<?>> ALL_KINDS =
@@ -67,13 +62,15 @@ public class ExportloadingTransformsTest implements Serializable {
private static final ImmutableSet<String> ALL_KIND_STRS =
ALL_KINDS.stream().map(Key::getKind).collect(ImmutableSet.toImmutableSet());
-@Rule public final transient TemporaryFolder exportRootDir = new TemporaryFolder();
+@SuppressWarnings("WeakerAccess")
+@TempDir
+transient Path tmpDir;
-@Rule public final transient InjectRule injectRule = new InjectRule();
+@RegisterExtension final transient InjectExtension injectRule = new InjectExtension();
-@Rule
-public final transient TestPipeline pipeline =
-TestPipeline.create().enableAbandonedNodeEnforcement(true);
+@RegisterExtension
+final transient TestPipelineExtension testPipeline =
+TestPipelineExtension.create().enableAbandonedNodeEnforcement(true);
private FakeClock fakeClock;
private transient BackupTestStore store;
@@ -84,8 +81,8 @@ public class ExportloadingTransformsTest implements Serializable {
private transient ContactResource contact;
private transient DomainBase domain;
-@Before
-public void beforeEach() throws Exception {
+@BeforeEach
+void beforeEach() throws Exception {
fakeClock = new FakeClock(START_TIME);
store = new BackupTestStore(fakeClock);
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
@@ -102,12 +99,11 @@ public class ExportloadingTransformsTest implements Serializable {
contact = (ContactResource) store.loadAsOfyEntity(contact);
domain = (DomainBase) store.loadAsOfyEntity(domain);
-exportDir =
-store.export(exportRootDir.getRoot().getAbsolutePath(), ALL_KINDS, Collections.EMPTY_SET);
+exportDir = store.export(tmpDir.toAbsolutePath().toString(), ALL_KINDS, Collections.EMPTY_SET);
}
-@After
-public void afterEach() throws Exception {
+@AfterEach
+void afterEach() throws Exception {
if (store != null) {
store.close();
store = null;
@@ -115,10 +111,9 @@ public class ExportloadingTransformsTest implements Serializable {
}
@Test
-@Category(NeedsRunner.class)
-public void getExportFilePatterns() {
+void getExportFilePatterns() {
PCollection<String> patterns =
-pipeline.apply(
+testPipeline.apply(
"Get Datastore file patterns",
Transforms.getDatastoreExportFilePatterns(exportDir.getAbsolutePath(), ALL_KIND_STRS));
@@ -130,14 +125,13 @@ public class ExportloadingTransformsTest implements Serializable {
PAssert.that(patterns).containsInAnyOrder(expectedPatterns);
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void getFilesByPatterns() {
+void getFilesByPatterns() {
PCollection<Metadata> fileMetas =
-pipeline
+testPipeline
.apply(
"File patterns to metadata",
Create.of(
@@ -169,14 +163,13 @@ public class ExportloadingTransformsTest implements Serializable {
PAssert.that(fileNames).containsInAnyOrder(expectedFilenames);
-pipeline.run();
+testPipeline.run();
}
@Test
-@Category(NeedsRunner.class)
-public void loadDataFromFiles() {
+void loadDataFromFiles() {
PCollection<VersionedEntity> entities =
-pipeline
+testPipeline
.apply(
"Get Datastore file patterns",
Transforms.getDatastoreExportFilePatterns(
@@ -190,6 +183,6 @@ public class ExportloadingTransformsTest implements Serializable {
KV.of(Transforms.EXPORT_ENTITY_TIME_STAMP, store.loadAsDatastoreEntity(contact)),
KV.of(Transforms.EXPORT_ENTITY_TIME_STAMP, store.loadAsDatastoreEntity(domain)));
-pipeline.run();
+testPipeline.run();
}
}

@@ -0,0 +1,67 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import static google.registry.testing.truth.TextDiffSubject.assertWithMessageAboutUrlSource;
import com.google.common.io.Resources;
import google.registry.beam.TestPipelineExtension;
import java.io.File;
import java.io.IOException;
import java.io.PrintStream;
import java.net.URL;
import org.apache.beam.runners.core.construction.renderer.PipelineDotRenderer;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
/** Manages visualization of {@link InitSqlPipeline}. */
class InitSqlPipelineGraphTest {
private static final String GOLDEN_DOT_FILE = "pipeline_golden.dot";
private static final String[] OPTIONS_ARGS =
new String[] {
"--commitLogStartTimestamp=2000-01-01TZ",
"--commitLogEndTimestamp=2000-01-02TZ",
"--datastoreExportDir=/somedir",
"--commitLogDir=/someotherdir",
"--environment=alpha"
};
private static final transient InitSqlPipelineOptions options =
PipelineOptionsFactory.fromArgs(OPTIONS_ARGS)
.withValidation()
.as(InitSqlPipelineOptions.class);
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.create().enableAbandonedNodeEnforcement(false);
@Test
public void createPipeline_compareGraph() throws IOException {
new InitSqlPipeline(options, testPipeline).setupPipeline();
String dotString = PipelineDotRenderer.toDotString(testPipeline);
URL goldenDotUrl = Resources.getResource(InitSqlPipelineGraphTest.class, GOLDEN_DOT_FILE);
File outputFile = new File(new File(goldenDotUrl.getFile()).getParent(), "pipeline_curr.dot");
try (PrintStream ps = new PrintStream(outputFile)) {
ps.print(dotString);
}
assertWithMessageAboutUrlSource(
"InitSqlPipeline graph changed. Run :core:updateInitSqlPipelineGraph to update.")
.that(outputFile.toURI().toURL())
.hasSameContentAs(goldenDotUrl);
}
}
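
InitSqlPipelineGraphTest above uses the golden-file pattern: render the current pipeline graph next to the golden copy (so a failing run leaves a diffable artifact), then compare contents. A self-contained sketch of that pattern — the `GoldenFileSketch` class and file names here are illustrative, not the project's actual helper:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of the golden-file comparison used by InitSqlPipelineGraphTest:
// write the freshly rendered output beside the golden file, then compare.
public class GoldenFileSketch {
  static boolean matchesGolden(Path golden, String current) throws IOException {
    // Keep the current rendering on disk so a mismatch can be inspected/diffed.
    Path currentFile = golden.resolveSibling("pipeline_curr.dot");
    Files.writeString(currentFile, current);
    return Files.readString(golden).equals(current);
  }

  public static void main(String[] args) throws IOException {
    Path dir = Files.createTempDirectory("golden");
    Path golden = dir.resolve("pipeline_golden.dot");
    Files.writeString(golden, "digraph {}");
    System.out.println(matchesGolden(golden, "digraph {}")); // true
    System.out.println(matchesGolden(golden, "digraph { a }")); // false
  }
}
```

On a mismatch, regenerating the golden file (here, via `:core:updateInitSqlPipelineGraph`) amounts to copying the current rendering over the golden copy.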

@@ -1,4 +1,4 @@
-// Copyright 2017 The Nomulus Authors. All Rights Reserved.
+// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -12,20 +12,16 @@
// See the License for the specific language governing permissions and
// limitations under the License.
-package google.registry.bigquery;
+package google.registry.beam.initsql;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.apache.beam.sdk.options.PipelineOptionsFactory;
+import org.junit.jupiter.api.Test;
-/** Unit tests for {@link BigqueryConnection}. */
-@RunWith(JUnit4.class)
-public class BigqueryConnectionTest {
+/** Unit tests for {@link google.registry.beam.initsql.InitSqlPipelineOptions}. */
+public class InitSqlPipelineOptionsTest {
@Test
-public void testNothing() {
-// Placeholder test class for now.
-// TODO(b/16569089): figure out a good way for testing our Bigquery usage overall - maybe unit
-// tests here, maybe end-to-end testing.
+void registerToValidate() {
+PipelineOptionsFactory.register(InitSqlPipelineOptions.class);
}
}

@@ -0,0 +1,281 @@
// Copyright 2020 The Nomulus Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package google.registry.beam.initsql;
import static com.google.common.truth.Truth.assertThat;
import static google.registry.model.ImmutableObjectSubject.assertAboutImmutableObjects;
import static google.registry.model.ImmutableObjectSubject.immutableObjectCorrespondence;
import static google.registry.persistence.transaction.TransactionManagerFactory.jpaTm;
import static google.registry.testing.DatastoreHelper.newRegistry;
import static google.registry.testing.DatastoreHelper.persistResource;
import static google.registry.util.DateTimeUtils.START_OF_TIME;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.backup.AppEngineEnvironment;
import google.registry.beam.TestPipelineExtension;
import google.registry.model.billing.BillingEvent;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DesignatedContact;
import google.registry.model.domain.DomainAuthInfo;
import google.registry.model.domain.DomainBase;
import google.registry.model.domain.GracePeriod;
import google.registry.model.domain.launch.LaunchNotice;
import google.registry.model.domain.rgp.GracePeriodStatus;
import google.registry.model.domain.secdns.DelegationSignerData;
import google.registry.model.eppcommon.AuthInfo.PasswordAuth;
import google.registry.model.eppcommon.StatusValue;
import google.registry.model.eppcommon.Trid;
import google.registry.model.host.HostResource;
import google.registry.model.ofy.Ofy;
import google.registry.model.poll.PollMessage;
import google.registry.model.registrar.Registrar;
import google.registry.model.registry.Registry;
import google.registry.model.reporting.HistoryEntry;
import google.registry.model.transfer.DomainTransferData;
import google.registry.model.transfer.TransferStatus;
import google.registry.persistence.VKey;
import google.registry.persistence.transaction.JpaTestRules;
import google.registry.persistence.transaction.JpaTestRules.JpaIntegrationTestExtension;
import google.registry.testing.AppEngineExtension;
import google.registry.testing.DatastoreEntityExtension;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectExtension;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.joda.time.DateTime;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.api.io.TempDir;
/** Unit tests for {@link InitSqlPipeline}. */
class InitSqlPipelineTest {
private static final DateTime START_TIME = DateTime.parse("2000-01-01T00:00:00.0Z");
private static final ImmutableList<Class<?>> ALL_KINDS =
ImmutableList.of(
Registry.class,
Registrar.class,
ContactResource.class,
HostResource.class,
DomainBase.class,
HistoryEntry.class);
private transient FakeClock fakeClock = new FakeClock(START_TIME);
@RegisterExtension
@Order(Order.DEFAULT - 1)
final transient DatastoreEntityExtension datastore = new DatastoreEntityExtension();
@RegisterExtension final transient InjectExtension injectRule = new InjectExtension();
@SuppressWarnings("WeakerAccess")
@TempDir
transient Path tmpDir;
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.create().enableAbandonedNodeEnforcement(true);
@RegisterExtension
final transient JpaIntegrationTestExtension database =
new JpaTestRules.Builder().withClock(fakeClock).buildIntegrationTestRule();
// Must not be transient!
@RegisterExtension
@Order(Order.DEFAULT + 1)
final BeamJpaExtension beamJpaExtension =
new BeamJpaExtension(() -> tmpDir.resolve("credential.dat"), database.getDatabase());
private File exportRootDir;
private File exportDir;
private File commitLogDir;
private transient Registrar registrar1;
private transient Registrar registrar2;
private transient DomainBase domain;
private transient ContactResource contact1;
private transient ContactResource contact2;
private transient HostResource hostResource;
private transient HistoryEntry historyEntry;
@BeforeEach
void beforeEach() throws Exception {
try (BackupTestStore store = new BackupTestStore(fakeClock)) {
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
exportRootDir = Files.createDirectory(tmpDir.resolve("exports")).toFile();
persistResource(newRegistry("com", "COM"));
registrar1 = persistResource(AppEngineExtension.makeRegistrar1());
registrar2 = persistResource(AppEngineExtension.makeRegistrar2());
Key<DomainBase> domainKey = Key.create(null, DomainBase.class, "4-COM");
hostResource =
persistResource(
new HostResource.Builder()
.setHostName("ns1.example.com")
.setSuperordinateDomain(VKey.from(domainKey))
.setRepoId("1-COM")
.setCreationClientId(registrar1.getClientId())
.setPersistedCurrentSponsorClientId(registrar2.getClientId())
.build());
contact1 =
persistResource(
new ContactResource.Builder()
.setContactId("contact_id1")
.setRepoId("2-COM")
.setCreationClientId(registrar1.getClientId())
.setPersistedCurrentSponsorClientId(registrar2.getClientId())
.build());
contact2 =
persistResource(
new ContactResource.Builder()
.setContactId("contact_id2")
.setRepoId("3-COM")
.setCreationClientId(registrar1.getClientId())
.setPersistedCurrentSponsorClientId(registrar1.getClientId())
.build());
historyEntry = persistResource(new HistoryEntry.Builder().setParent(domainKey).build());
Key<HistoryEntry> historyEntryKey = Key.create(historyEntry);
Key<BillingEvent.OneTime> oneTimeBillKey =
Key.create(historyEntryKey, BillingEvent.OneTime.class, 1);
VKey<BillingEvent.Recurring> recurringBillKey =
VKey.from(Key.create(historyEntryKey, BillingEvent.Recurring.class, 2));
VKey<PollMessage.Autorenew> autorenewPollKey =
VKey.from(Key.create(historyEntryKey, PollMessage.Autorenew.class, 3));
VKey<PollMessage.OneTime> onetimePollKey =
VKey.from(Key.create(historyEntryKey, PollMessage.OneTime.class, 1));
domain =
persistResource(
new DomainBase.Builder()
.setDomainName("example.com")
.setRepoId("4-COM")
.setCreationClientId(registrar1.getClientId())
.setLastEppUpdateTime(fakeClock.nowUtc())
.setLastEppUpdateClientId(registrar2.getClientId())
.setLastTransferTime(fakeClock.nowUtc())
.setStatusValues(
ImmutableSet.of(
StatusValue.CLIENT_DELETE_PROHIBITED,
StatusValue.SERVER_DELETE_PROHIBITED,
StatusValue.SERVER_TRANSFER_PROHIBITED,
StatusValue.SERVER_UPDATE_PROHIBITED,
StatusValue.SERVER_RENEW_PROHIBITED,
StatusValue.SERVER_HOLD))
.setRegistrant(contact1.createVKey())
.setContacts(
ImmutableSet.of(
DesignatedContact.create(
DesignatedContact.Type.ADMIN, contact2.createVKey())))
.setNameservers(ImmutableSet.of(hostResource.createVKey()))
.setSubordinateHosts(ImmutableSet.of("ns1.example.com"))
.setPersistedCurrentSponsorClientId(registrar2.getClientId())
.setRegistrationExpirationTime(fakeClock.nowUtc().plusYears(1))
.setAuthInfo(DomainAuthInfo.create(PasswordAuth.create("password")))
.setDsData(
ImmutableSet.of(DelegationSignerData.create(1, 2, 3, new byte[] {0, 1, 2})))
.setLaunchNotice(
LaunchNotice.create("tcnid", "validatorId", START_OF_TIME, START_OF_TIME))
.setTransferData(
new DomainTransferData.Builder()
.setGainingClientId(registrar1.getClientId())
.setLosingClientId(registrar2.getClientId())
.setPendingTransferExpirationTime(fakeClock.nowUtc())
.setServerApproveEntities(
ImmutableSet.of(
VKey.from(oneTimeBillKey), recurringBillKey, autorenewPollKey))
.setServerApproveBillingEvent(VKey.from(oneTimeBillKey))
.setServerApproveAutorenewEvent(recurringBillKey)
.setServerApproveAutorenewPollMessage(autorenewPollKey)
.setTransferRequestTime(fakeClock.nowUtc().plusDays(1))
.setTransferStatus(TransferStatus.SERVER_APPROVED)
.setTransferRequestTrid(Trid.create("client-trid", "server-trid"))
.build())
.setDeletePollMessage(onetimePollKey)
.setAutorenewBillingEvent(recurringBillKey)
.setAutorenewPollMessage(autorenewPollKey)
.setSmdId("smdid")
.addGracePeriod(
GracePeriod.create(
GracePeriodStatus.ADD, fakeClock.nowUtc().plusDays(1), "registrar", null))
.build());
exportDir = store.export(exportRootDir.getAbsolutePath(), ALL_KINDS, ImmutableSet.of());
commitLogDir = Files.createDirectory(tmpDir.resolve("commits")).toFile();
}
}
@Test
void runPipeline() {
InitSqlPipelineOptions options =
PipelineOptionsFactory.fromArgs(
"--sqlCredentialUrlOverride="
+ beamJpaExtension.getCredentialFile().getAbsolutePath(),
"--commitLogStartTimestamp=" + START_TIME,
"--commitLogEndTimestamp=" + fakeClock.nowUtc().plusMillis(1),
"--datastoreExportDir=" + exportDir.getAbsolutePath(),
"--commitLogDir=" + commitLogDir.getAbsolutePath())
.withValidation()
.as(InitSqlPipelineOptions.class);
InitSqlPipeline initSqlPipeline = new InitSqlPipeline(options, testPipeline);
initSqlPipeline.run().waitUntilFinish();
try (AppEngineEnvironment env = new AppEngineEnvironment("test")) {
assertHostResourceEquals(
jpaTm().transact(() -> jpaTm().load(hostResource.createVKey())), hostResource);
assertThat(jpaTm().transact(() -> jpaTm().loadAll(Registrar.class)))
.comparingElementsUsing(immutableObjectCorrespondence("lastUpdateTime"))
.containsExactly(registrar1, registrar2);
assertThat(jpaTm().transact(() -> jpaTm().loadAll(ContactResource.class)))
.comparingElementsUsing(immutableObjectCorrespondence("revisions", "updateTimestamp"))
.containsExactly(contact1, contact2);
assertCleansedDomainEquals(jpaTm().transact(() -> jpaTm().load(domain.createVKey())), domain);
}
}
private static void assertHostResourceEquals(HostResource actual, HostResource expected) {
assertAboutImmutableObjects()
.that(actual)
.isEqualExceptFields(expected, "superordinateDomain", "revisions", "updateTimestamp");
assertThat(actual.getSuperordinateDomain().getSqlKey())
.isEqualTo(expected.getSuperordinateDomain().getSqlKey());
}
private static void assertCleansedDomainEquals(DomainBase actual, DomainBase expected) {
assertAboutImmutableObjects()
.that(actual)
.isEqualExceptFields(
expected,
"adminContact",
"registrantContact",
"gracePeriods",
"dsData",
"allContacts",
"revisions",
"updateTimestamp",
"autorenewBillingEvent",
"autorenewPollMessage",
"deletePollMessage",
"nsHosts",
"transferData");
assertThat(actual.getAdminContact().getSqlKey())
.isEqualTo(expected.getAdminContact().getSqlKey());
assertThat(actual.getRegistrant().getSqlKey()).isEqualTo(expected.getRegistrant().getSqlKey());
// TODO(weiminyu): compare gracePeriods, allContacts and dsData, when SQL model supports them.
}
}
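The `beforeEach` above repeatedly uses `Files.createDirectory(tmpDir.resolve(...)).toFile()` where the JUnit 4 version called `temporaryFolder.newFolder()`. A minimal sketch of that migration pattern, using a plain temp directory to stand in for the `@TempDir`-injected field (names are illustrative):

```java
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempDirMigration {
  // JUnit 4's TemporaryFolder.newFolder() created subdirectories on demand; with
  // JUnit 5's @TempDir-injected Path they must be created explicitly, as the
  // tests above do with Files.createDirectory(tmpDir.resolve(...)).toFile().
  static File newSubFolder(Path tmpDir, String name) {
    try {
      return Files.createDirectory(tmpDir.resolve(name)).toFile();
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
  }

  public static void main(String[] args) throws IOException {
    // A plain temp directory stands in for the @TempDir field here.
    Path tmpDir = Files.createTempDirectory("init-sql-test");
    File exportRootDir = newSubFolder(tmpDir, "exports");
    System.out.println(exportRootDir.isDirectory()); // true
  }
}
```

Returning a `File` keeps the rest of the test (which still passes absolute paths to the pipeline) unchanged.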


@@ -22,6 +22,7 @@ import com.google.appengine.api.datastore.Entity;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.googlecode.objectify.Key;
import google.registry.beam.TestPipelineExtension;
import google.registry.model.contact.ContactResource;
import google.registry.model.domain.DomainAuthInfo;
import google.registry.model.domain.DomainBase;
@@ -29,20 +30,17 @@ import google.registry.model.eppcommon.AuthInfo.PasswordAuth;
import google.registry.model.ofy.Ofy;
import google.registry.model.registry.Registry;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectRule;
import google.registry.testing.InjectExtension;
import java.io.File;
import org.apache.beam.sdk.testing.NeedsRunner;
import org.apache.beam.sdk.testing.TestPipeline;
import java.nio.file.Files;
import java.nio.file.Path;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.joda.time.DateTime;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import org.junit.rules.TemporaryFolder;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.api.io.TempDir;
/**
* Unit test for {@link Transforms#loadDatastoreSnapshot}.
@@ -71,8 +69,8 @@ import org.junit.runners.JUnit4;
* <li>Deletes are properly handled.
* </ul>
*/
@RunWith(JUnit4.class)
public class LoadDatastoreSnapshotTest {
class LoadDatastoreSnapshotTest {
private static final DateTime START_TIME = DateTime.parse("2000-01-01T00:00:00.0Z");
private static final ImmutableList<Class<?>> ALL_KINDS =
@@ -80,13 +78,15 @@ public class LoadDatastoreSnapshotTest {
private static final ImmutableSet<String> ALL_KIND_STRS =
ALL_KINDS.stream().map(Key::getKind).collect(ImmutableSet.toImmutableSet());
@Rule public final transient TemporaryFolder temporaryFolder = new TemporaryFolder();
@SuppressWarnings("WeakerAccess")
@TempDir
transient Path tmpDir;
@Rule public final transient InjectRule injectRule = new InjectRule();
@RegisterExtension final transient InjectExtension injectRule = new InjectExtension();
@Rule
public final transient TestPipeline pipeline =
TestPipeline.create().enableAbandonedNodeEnforcement(true);
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.create().enableAbandonedNodeEnforcement(true);
private FakeClock fakeClock;
private File exportRootDir;
@@ -102,14 +102,14 @@ public class LoadDatastoreSnapshotTest {
private transient DateTime contactLastUpdateTime;
private transient DateTime domainLastUpdateTime;
@Before
public void beforeEach() throws Exception {
@BeforeEach
void beforeEach() throws Exception {
fakeClock = new FakeClock(START_TIME);
try (BackupTestStore store = new BackupTestStore(fakeClock)) {
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
exportRootDir = temporaryFolder.newFolder();
commitLogsDir = temporaryFolder.newFolder();
exportRootDir = Files.createDirectory(tmpDir.resolve("export_root")).toFile();
commitLogsDir = Files.createDirectory(tmpDir.resolve("commit_logs")).toFile();
Registry registry = newRegistry("tld1", "TLD1");
ContactResource fillerContact = newContactResource("contact_filler");
@@ -154,10 +154,9 @@ public class LoadDatastoreSnapshotTest {
}
@Test
@Category(NeedsRunner.class)
public void loadDatastoreSnapshot() {
void loadDatastoreSnapshot() {
PCollectionTuple snapshot =
pipeline.apply(
testPipeline.apply(
Transforms.loadDatastoreSnapshot(
exportDir.getAbsolutePath(),
commitLogsDir.getAbsolutePath(),
@@ -173,6 +172,6 @@ public class LoadDatastoreSnapshotTest {
InitSqlTestUtils.assertContainsExactlyElementsIn(
snapshot.get(Transforms.createTagForKind("ContactResource")),
KV.of(contactLastUpdateTime.getMillis(), dsContact));
pipeline.run();
testPipeline.run();
}
}


@@ -15,72 +15,75 @@
package google.registry.beam.initsql;
import static com.google.common.truth.Truth.assertThat;
import static google.registry.model.ImmutableObjectSubject.immutableObjectCorrespondence;
import static google.registry.persistence.transaction.TransactionManagerFactory.jpaTm;
import com.google.appengine.api.datastore.Entity;
import com.google.common.collect.ImmutableList;
import google.registry.backup.VersionedEntity;
import google.registry.beam.TestPipelineExtension;
import google.registry.model.ImmutableObject;
import google.registry.model.contact.ContactResource;
import google.registry.model.ofy.Ofy;
import google.registry.model.registrar.Registrar;
import google.registry.persistence.transaction.JpaTestRules;
import google.registry.persistence.transaction.JpaTestRules.JpaIntegrationTestRule;
import google.registry.testing.AppEngineRule;
import google.registry.persistence.transaction.JpaTestRules.JpaIntegrationTestExtension;
import google.registry.testing.AppEngineExtension;
import google.registry.testing.DatastoreEntityExtension;
import google.registry.testing.DatastoreHelper;
import google.registry.testing.FakeClock;
import google.registry.testing.InjectRule;
import java.io.File;
import java.io.PrintStream;
import google.registry.testing.InjectExtension;
import java.io.Serializable;
import java.nio.file.Path;
import java.util.stream.Collectors;
import org.apache.beam.sdk.testing.NeedsRunner;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.joda.time.DateTime;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import org.junit.rules.RuleChain;
import org.junit.rules.TemporaryFolder;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.api.io.TempDir;
/** Unit test for {@link Transforms#writeToSql}. */
@RunWith(JUnit4.class)
public class WriteToSqlTest implements Serializable {
class WriteToSqlTest implements Serializable {
private static final DateTime START_TIME = DateTime.parse("2000-01-01T00:00:00.0Z");
private final FakeClock fakeClock = new FakeClock(START_TIME);
@Rule public final transient InjectRule injectRule = new InjectRule();
@RegisterExtension
@Order(Order.DEFAULT - 1)
final transient DatastoreEntityExtension datastore = new DatastoreEntityExtension();
// For use in the RuleChain below. Saves a reference to retrieve Database connection config.
public final transient JpaIntegrationTestRule database =
@RegisterExtension final transient InjectExtension injectRule = new InjectExtension();
@RegisterExtension
final transient JpaIntegrationTestExtension database =
new JpaTestRules.Builder().withClock(fakeClock).buildIntegrationTestRule();
@Rule
public final transient RuleChain jpaRules =
RuleChain.outerRule(new DatastoreEntityExtension()).around(database);
@SuppressWarnings("WeakerAccess")
@TempDir
transient Path tmpDir;
@Rule public transient TemporaryFolder temporaryFolder = new TemporaryFolder();
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.create().enableAbandonedNodeEnforcement(true);
@Rule
public final transient TestPipeline pipeline =
TestPipeline.create().enableAbandonedNodeEnforcement(true);
// Must not be transient!
@RegisterExtension
@Order(Order.DEFAULT + 1)
final BeamJpaExtension beamJpaExtension =
new BeamJpaExtension(() -> tmpDir.resolve("credential.dat"), database.getDatabase());
private ImmutableList<Entity> contacts;
private File credentialFile;
@Before
public void beforeEach() throws Exception {
@BeforeEach
void beforeEach() throws Exception {
try (BackupTestStore store = new BackupTestStore(fakeClock)) {
injectRule.setStaticField(Ofy.class, "clock", fakeClock);
// Required for contacts created below.
Registrar ofyRegistrar = AppEngineRule.makeRegistrar2();
Registrar ofyRegistrar = AppEngineExtension.makeRegistrar2();
store.insertOrUpdate(ofyRegistrar);
jpaTm().transact(() -> jpaTm().saveNewOrUpdate(store.loadAsOfyEntity(ofyRegistrar)));
@@ -93,20 +96,11 @@ public class WriteToSqlTest implements Serializable {
}
contacts = builder.build();
}
credentialFile = temporaryFolder.newFile();
new PrintStream(credentialFile)
.printf(
"%s %s %s",
database.getDatabaseUrl(),
database.getDatabaseUsername(),
database.getDatabasePassword())
.close();
}
@Test
@Category(NeedsRunner.class)
public void writeToSql_twoWriters() {
pipeline
void writeToSql_twoWriters() {
testPipeline
.apply(
Create.of(
contacts.stream()
@@ -120,14 +114,18 @@ public class WriteToSqlTest implements Serializable {
4,
() ->
DaggerBeamJpaModule_JpaTransactionManagerComponent.builder()
.beamJpaModule(new BeamJpaModule(credentialFile.getAbsolutePath()))
.beamJpaModule(beamJpaExtension.getBeamJpaModule())
.build()
.localDbJpaTransactionManager()));
pipeline.run().waitUntilFinish();
testPipeline.run().waitUntilFinish();
ImmutableList<?> sqlContacts = jpaTm().transact(() -> jpaTm().loadAll(ContactResource.class));
// TODO(weiminyu): compare loaded entities with originals. Note: lastUpdateTimes won't match by
// design. Need an elegant way to deal with this.
assertThat(sqlContacts).hasSize(3);
assertThat(sqlContacts)
.comparingElementsUsing(immutableObjectCorrespondence("revisions", "updateTimestamp"))
.containsExactlyElementsIn(
contacts.stream()
.map(InitSqlTestUtils::datastoreToOfyEntity)
.map(ImmutableObject.class::cast)
.collect(ImmutableList.toImmutableList()));
}
}
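The removed `PrintStream` block in this file wrote the database credential as a single space-separated line (URL, username, password), a job `BeamJpaExtension` now owns. A minimal sketch of writing and re-reading that one-line format, assuming the same three-field layout as the removed `printf` (field values are illustrative):

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class CredentialLine {
  // Mirrors the removed printf("%s %s %s", url, user, password): one line,
  // three space-separated fields. Note this layout breaks if any field itself
  // contains a space, one reason to centralize it in a single extension.
  static String format(String url, String user, String password) {
    return String.format("%s %s %s", url, user, password);
  }

  // Splits the line back into [url, user, password].
  static String[] parse(String line) {
    return line.trim().split(" ", 3);
  }

  public static void main(String[] args) throws Exception {
    Path dir = Files.createTempDirectory("beam-jpa");
    Path credentialFile = dir.resolve("credential.dat");
    Files.writeString(credentialFile, format("jdbc:postgresql://localhost/nomulus", "user", "pw"));
    String[] fields = parse(Files.readString(credentialFile));
    System.out.println(fields[2]); // pw
  }
}
```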


@@ -29,14 +29,11 @@ import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
/** Unit tests for {@link BillingEvent} */
@RunWith(JUnit4.class)
public class BillingEventTest {
class BillingEventTest {
private static final String BILLING_EVENT_SCHEMA =
"{\"name\": \"BillingEvent\", "
@@ -60,8 +57,8 @@ public class BillingEventTest {
private SchemaAndRecord schemaAndRecord;
@Before
public void initializeRecord() {
@BeforeEach
void beforeEach() {
// Create a record with a given JSON schema.
schemaAndRecord = new SchemaAndRecord(createRecord(), null);
}
@@ -86,7 +83,7 @@ public class BillingEventTest {
}
@Test
public void testParseBillingEventFromRecord_success() {
void testParseBillingEventFromRecord_success() {
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
assertThat(event.id()).isEqualTo(1);
assertThat(event.billingTime())
@@ -107,7 +104,7 @@ public class BillingEventTest {
}
@Test
public void testParseBillingEventFromRecord_sunriseCreate_reducedPrice_success() {
void testParseBillingEventFromRecord_sunriseCreate_reducedPrice_success() {
schemaAndRecord.getRecord().put("flags", "SUNRISE");
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
assertThat(event.amount()).isEqualTo(17.43);
@@ -115,7 +112,7 @@ public class BillingEventTest {
}
@Test
public void testParseBillingEventFromRecord_anchorTenant_zeroPrice_success() {
void testParseBillingEventFromRecord_anchorTenant_zeroPrice_success() {
schemaAndRecord.getRecord().put("flags", "SUNRISE ANCHOR_TENANT");
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
assertThat(event.amount()).isZero();
@@ -123,13 +120,13 @@ public class BillingEventTest {
}
@Test
public void testParseBillingEventFromRecord_nullValue_throwsException() {
void testParseBillingEventFromRecord_nullValue_throwsException() {
schemaAndRecord.getRecord().put("tld", null);
assertThrows(IllegalStateException.class, () -> BillingEvent.parseFromRecord(schemaAndRecord));
}
@Test
public void testConvertBillingEvent_toCsv() {
void testConvertBillingEvent_toCsv() {
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
assertThat(event.toCsv())
.isEqualTo(
@@ -138,7 +135,7 @@ public class BillingEventTest {
}
@Test
public void testConvertBillingEvent_nonNullPoNumber_toCsv() {
void testConvertBillingEvent_nonNullPoNumber_toCsv() {
GenericRecord record = createRecord();
record.put("poNumber", "905610");
BillingEvent event = BillingEvent.parseFromRecord(new SchemaAndRecord(record, null));
@@ -149,13 +146,13 @@ public class BillingEventTest {
}
@Test
public void testGenerateBillingEventFilename() {
void testGenerateBillingEventFilename() {
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
assertThat(event.toFilename("2017-10")).isEqualTo("invoice_details_2017-10_myRegistrar_test");
}
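The assertion above pins down the detail-report naming scheme. A minimal sketch of that scheme, inferred from the expected string rather than from `BillingEvent.toFilename`'s actual implementation:

```java
public class InvoiceFilename {
  // Rebuilds the name asserted in testGenerateBillingEventFilename:
  // invoice_details_<yearMonth>_<registrarId>_<tld>.
  static String toFilename(String yearMonth, String registrarId, String tld) {
    return String.format("invoice_details_%s_%s_%s", yearMonth, registrarId, tld);
  }

  public static void main(String[] args) {
    System.out.println(toFilename("2017-10", "myRegistrar", "test"));
    // invoice_details_2017-10_myRegistrar_test
  }
}
```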
@Test
public void testGetInvoiceGroupingKey_fromBillingEvent() {
void testGetInvoiceGroupingKey_fromBillingEvent() {
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
InvoiceGroupingKey invoiceKey = event.getInvoiceGroupingKey();
assertThat(invoiceKey.startDate()).isEqualTo("2017-10-01");
@@ -169,7 +166,7 @@ public class BillingEventTest {
}
@Test
public void test_nonNullPoNumber() {
void test_nonNullPoNumber() {
GenericRecord record = createRecord();
record.put("poNumber", "905610");
BillingEvent event = BillingEvent.parseFromRecord(new SchemaAndRecord(record, null));
@@ -179,7 +176,7 @@ public class BillingEventTest {
}
@Test
public void testConvertInvoiceGroupingKey_toCsv() {
void testConvertInvoiceGroupingKey_toCsv() {
BillingEvent event = BillingEvent.parseFromRecord(schemaAndRecord);
InvoiceGroupingKey invoiceKey = event.getInvoiceGroupingKey();
assertThat(invoiceKey.toCsv(3L))
@@ -189,7 +186,7 @@ public class BillingEventTest {
}
@Test
public void testInvoiceGroupingKeyCoder_deterministicSerialization() throws IOException {
void testInvoiceGroupingKeyCoder_deterministicSerialization() throws IOException {
InvoiceGroupingKey invoiceKey =
BillingEvent.parseFromRecord(schemaAndRecord).getInvoiceGroupingKey();
InvoiceGroupingKeyCoder coder = new InvoiceGroupingKeyCoder();
@@ -200,7 +197,7 @@ public class BillingEventTest {
}
@Test
public void testGetDetailReportHeader() {
void testGetDetailReportHeader() {
assertThat(BillingEvent.getHeader())
.isEqualTo(
"id,billingTime,eventTime,registrarId,billingId,poNumber,tld,action,"
@@ -208,7 +205,7 @@ public class BillingEventTest {
}
@Test
public void testGetOverallInvoiceHeader() {
void testGetOverallInvoiceHeader() {
assertThat(InvoiceGroupingKey.invoiceHeader())
.isEqualTo("StartDate,EndDate,ProductAccountKey,Amount,AmountCurrency,BillingProductCode,"
+ "SalesChannel,LineItemType,UsageGroupingKey,Quantity,Description,UnitPrice,"


@@ -19,10 +19,13 @@ import static com.google.common.truth.Truth.assertThat;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import google.registry.beam.TestPipelineExtension;
import google.registry.util.GoogleCredentialsBundle;
import google.registry.util.ResourceUtils;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.Map.Entry;
@@ -30,47 +33,48 @@ import org.apache.beam.runners.direct.DirectRunner;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.api.io.TempDir;
/** Unit tests for {@link InvoicingPipeline}. */
@RunWith(JUnit4.class)
public class InvoicingPipelineTest {
class InvoicingPipelineTest {
private static PipelineOptions pipelineOptions;
@BeforeClass
public static void initializePipelineOptions() {
@BeforeAll
static void beforeAll() {
pipelineOptions = PipelineOptionsFactory.create();
pipelineOptions.setRunner(DirectRunner.class);
}
@Rule public final transient TestPipeline p = TestPipeline.fromOptions(pipelineOptions);
@Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.fromOptions(pipelineOptions);
@SuppressWarnings("WeakerAccess")
@TempDir
transient Path tmpDir;
private InvoicingPipeline invoicingPipeline;
@Before
public void initializePipeline() throws IOException {
File beamTempFolder = tempFolder.newFolder();
String beamTempFolderPath = beamTempFolder.getAbsolutePath();
invoicingPipeline = new InvoicingPipeline(
"test-project",
beamTempFolderPath,
beamTempFolderPath + "/templates/invoicing",
beamTempFolderPath + "/staging",
tempFolder.getRoot().getAbsolutePath(),
"REG-INV",
GoogleCredentialsBundle.create(GoogleCredentials.create(null))
);
@BeforeEach
void beforeEach() throws IOException {
String beamTempFolder =
Files.createDirectory(tmpDir.resolve("beam_temp")).toAbsolutePath().toString();
invoicingPipeline =
new InvoicingPipeline(
"test-project",
beamTempFolder,
beamTempFolder + "/templates/invoicing",
beamTempFolder + "/staging",
tmpDir.toAbsolutePath().toString(),
"REG-INV",
GoogleCredentialsBundle.create(GoogleCredentials.create(null)));
}
private ImmutableList<BillingEvent> getInputEvents() {
@@ -186,17 +190,18 @@ public class InvoicingPipelineTest {
}
@Test
public void testEndToEndPipeline_generatesExpectedFiles() throws Exception {
void testEndToEndPipeline_generatesExpectedFiles() throws Exception {
ImmutableList<BillingEvent> inputRows = getInputEvents();
PCollection<BillingEvent> input = p.apply(Create.of(inputRows));
PCollection<BillingEvent> input = testPipeline.apply(Create.of(inputRows));
invoicingPipeline.applyTerminalTransforms(input, StaticValueProvider.of("2017-10"));
p.run();
testPipeline.run();
for (Entry<String, ImmutableList<String>> entry : getExpectedDetailReportMap().entrySet()) {
ImmutableList<String> detailReport = resultFileContents(entry.getKey());
assertThat(detailReport.get(0))
.isEqualTo("id,billingTime,eventTime,registrarId,billingId,poNumber,tld,action,"
+ "domain,repositoryId,years,currency,amount,flags");
.isEqualTo(
"id,billingTime,eventTime,registrarId,billingId,poNumber,tld,action,"
+ "domain,repositoryId,years,currency,amount,flags");
assertThat(detailReport.subList(1, detailReport.size()))
.containsExactlyElementsIn(entry.getValue());
}
@@ -215,8 +220,7 @@ public class InvoicingPipelineTest {
private ImmutableList<String> resultFileContents(String filename) throws Exception {
File resultFile =
new File(
String.format(
"%s/invoices/2017-10/%s", tempFolder.getRoot().getAbsolutePath(), filename));
String.format("%s/invoices/2017-10/%s", tmpDir.toAbsolutePath().toString(), filename));
return ImmutableList.copyOf(
ResourceUtils.readResourceUtf8(resultFile.toURI().toURL()).split("\n"));
}


@@ -25,16 +25,13 @@ import org.apache.beam.sdk.io.FileBasedSink;
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
import org.apache.beam.sdk.transforms.SerializableFunction;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.Test;
/** Unit tests for {@link InvoicingUtils}. */
@RunWith(JUnit4.class)
public class InvoicingUtilsTest {
class InvoicingUtilsTest {
@Test
public void testDestinationFunction_generatesProperFileParams() {
void testDestinationFunction_generatesProperFileParams() {
SerializableFunction<BillingEvent, Params> destinationFunction =
InvoicingUtils.makeDestinationFunction("my/directory", StaticValueProvider.of("2017-10"));
@@ -53,7 +50,7 @@ public class InvoicingUtilsTest {
}
@Test
public void testEmptyDestinationParams() {
void testEmptyDestinationParams() {
assertThat(InvoicingUtils.makeEmptyDestinationParams("my/directory"))
.isEqualTo(
new Params()
@@ -63,7 +60,7 @@ public class InvoicingUtilsTest {
/** Asserts that the instantiated sql template matches a golden expected file. */
@Test
public void testMakeQueryProvider() {
void testMakeQueryProvider() {
ValueProvider<String> queryProvider =
InvoicingUtils.makeQueryProvider(StaticValueProvider.of("2017-10"), "my-project-id");
assertThat(queryProvider.get()).isEqualTo(loadFile("billing_events_test.sql"));


@@ -24,6 +24,7 @@ import static org.mockito.Mockito.withSettings;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.common.collect.ImmutableList;
import com.google.common.io.CharStreams;
import google.registry.beam.TestPipelineExtension;
import google.registry.beam.spec11.SafeBrowsingTransforms.EvaluateSafeBrowsingFn;
import google.registry.testing.FakeClock;
import google.registry.testing.FakeSleeper;
@@ -37,13 +38,14 @@ import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.function.Supplier;
import org.apache.beam.runners.direct.DirectRunner;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.http.ProtocolVersion;
@@ -56,44 +58,47 @@ import org.joda.time.DateTime;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.RegisterExtension;
import org.junit.jupiter.api.io.TempDir;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
/** Unit tests for {@link Spec11Pipeline}. */
@RunWith(JUnit4.class)
public class Spec11PipelineTest {
class Spec11PipelineTest {
private static PipelineOptions pipelineOptions;
@BeforeClass
public static void initializePipelineOptions() {
@BeforeAll
static void beforeAll() {
pipelineOptions = PipelineOptionsFactory.create();
pipelineOptions.setRunner(DirectRunner.class);
}
@Rule public final transient TestPipeline p = TestPipeline.fromOptions(pipelineOptions);
@Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
@RegisterExtension
final transient TestPipelineExtension testPipeline =
TestPipelineExtension.fromOptions(pipelineOptions);
@SuppressWarnings("WeakerAccess")
@TempDir
Path tmpDir;
private final Retrier retrier =
new Retrier(new FakeSleeper(new FakeClock(DateTime.parse("2019-07-15TZ"))), 1);
private Spec11Pipeline spec11Pipeline;
@Before
public void initializePipeline() throws IOException {
File beamTempFolder = tempFolder.newFolder();
@BeforeEach
void beforeEach() throws IOException {
String beamTempFolder =
Files.createDirectory(tmpDir.resolve("beam_temp")).toAbsolutePath().toString();
spec11Pipeline =
new Spec11Pipeline(
"test-project",
beamTempFolder.getAbsolutePath() + "/staging",
beamTempFolder.getAbsolutePath() + "/templates/invoicing",
tempFolder.getRoot().getAbsolutePath(),
beamTempFolder + "/staging",
beamTempFolder + "/templates/invoicing",
tmpDir.toAbsolutePath().toString(),
GoogleCredentialsBundle.create(GoogleCredentials.create(null)),
retrier);
}
@@ -127,7 +132,7 @@ public class Spec11PipelineTest {
*/
@Test
@SuppressWarnings("unchecked")
public void testEndToEndPipeline_generatesExpectedFiles() throws Exception {
void testEndToEndPipeline_generatesExpectedFiles() throws Exception {
// Establish mocks for testing
ImmutableList<Subdomain> inputRows = getInputDomains();
CloseableHttpClient httpClient = mock(CloseableHttpClient.class, withSettings().serializable());
@@ -142,9 +147,9 @@ public class Spec11PipelineTest {
(Serializable & Supplier) () -> httpClient);
// Apply input and evaluation transforms
PCollection<Subdomain> input = p.apply(Create.of(inputRows));
PCollection<Subdomain> input = testPipeline.apply(Create.of(inputRows));
spec11Pipeline.evaluateUrlHealth(input, evalFn, StaticValueProvider.of("2018-06-01"));
p.run();
testPipeline.run();
// Verify header and 4 threat matches for 3 registrars are found
ImmutableList<String> generatedReport = resultFileContents();
@@ -292,7 +297,7 @@ public class Spec11PipelineTest {
new File(
String.format(
"%s/icann/spec11/2018-06/SPEC11_MONTHLY_REPORT_2018-06-01",
tempFolder.getRoot().getAbsolutePath()));
tmpDir.toAbsolutePath().toString()));
return ImmutableList.copyOf(
ResourceUtils.readResourceUtf8(resultFile.toURI().toURL()).split("\n"));
}


@@ -27,20 +27,18 @@ import com.google.api.services.bigquery.model.JobReference;
import java.util.concurrent.TimeUnit;
import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.jupiter.api.Test;
/** Unit tests for {@link BigqueryUtils}. */
@RunWith(JUnit4.class)
public class BigqueryUtilsTest {
class BigqueryUtilsTest {
private static final DateTime DATE_0 = DateTime.parse("2014-07-17T20:35:42Z");
private static final DateTime DATE_1 = DateTime.parse("2014-07-17T20:35:42.1Z");
private static final DateTime DATE_2 = DateTime.parse("2014-07-17T20:35:42.12Z");
private static final DateTime DATE_3 = DateTime.parse("2014-07-17T20:35:42.123Z");
@Test
public void test_toBigqueryTimestampString() {
void test_toBigqueryTimestampString() {
assertThat(toBigqueryTimestampString(START_OF_TIME)).isEqualTo("1970-01-01 00:00:00.000");
assertThat(toBigqueryTimestampString(DATE_0)).isEqualTo("2014-07-17 20:35:42.000");
assertThat(toBigqueryTimestampString(DATE_1)).isEqualTo("2014-07-17 20:35:42.100");
@@ -50,7 +48,7 @@ public class BigqueryUtilsTest {
}
@Test
-public void test_toBigqueryTimestampString_convertsToUtc() {
+void test_toBigqueryTimestampString_convertsToUtc() {
assertThat(toBigqueryTimestampString(START_OF_TIME.withZone(DateTimeZone.forOffsetHours(5))))
.isEqualTo("1970-01-01 00:00:00.000");
assertThat(toBigqueryTimestampString(DateTime.parse("1970-01-01T00:00:00-0500")))
@@ -58,13 +56,13 @@ public class BigqueryUtilsTest {
}
@Test
-public void test_fromBigqueryTimestampString_startAndEndOfTime() {
+void test_fromBigqueryTimestampString_startAndEndOfTime() {
assertThat(fromBigqueryTimestampString("1970-01-01 00:00:00 UTC")).isEqualTo(START_OF_TIME);
assertThat(fromBigqueryTimestampString("294247-01-10 04:00:54.775 UTC")).isEqualTo(END_OF_TIME);
}
@Test
-public void test_fromBigqueryTimestampString_trailingZerosOkay() {
+void test_fromBigqueryTimestampString_trailingZerosOkay() {
assertThat(fromBigqueryTimestampString("2014-07-17 20:35:42 UTC")).isEqualTo(DATE_0);
assertThat(fromBigqueryTimestampString("2014-07-17 20:35:42.0 UTC")).isEqualTo(DATE_0);
assertThat(fromBigqueryTimestampString("2014-07-17 20:35:42.00 UTC")).isEqualTo(DATE_0);
@@ -78,27 +76,27 @@ public class BigqueryUtilsTest {
}
@Test
-public void testFailure_fromBigqueryTimestampString_nonUtcTimeZone() {
+void testFailure_fromBigqueryTimestampString_nonUtcTimeZone() {
assertThrows(
IllegalArgumentException.class,
() -> fromBigqueryTimestampString("2014-01-01 01:01:01 +05:00"));
}
@Test
-public void testFailure_fromBigqueryTimestampString_noTimeZone() {
+void testFailure_fromBigqueryTimestampString_noTimeZone() {
assertThrows(
IllegalArgumentException.class, () -> fromBigqueryTimestampString("2014-01-01 01:01:01"));
}
@Test
-public void testFailure_fromBigqueryTimestampString_tooManyMillisecondDigits() {
+void testFailure_fromBigqueryTimestampString_tooManyMillisecondDigits() {
assertThrows(
IllegalArgumentException.class,
() -> fromBigqueryTimestampString("2014-01-01 01:01:01.1234 UTC"));
}
@Test
-public void test_toBigqueryTimestamp_timeunitConversion() {
+void test_toBigqueryTimestamp_timeunitConversion() {
assertThat(toBigqueryTimestamp(1234567890L, TimeUnit.SECONDS))
.isEqualTo("1234567890.000000");
assertThat(toBigqueryTimestamp(1234567890123L, TimeUnit.MILLISECONDS))
@@ -110,14 +108,14 @@ public class BigqueryUtilsTest {
}
@Test
-public void test_toBigqueryTimestamp_timeunitConversionForZero() {
+void test_toBigqueryTimestamp_timeunitConversionForZero() {
assertThat(toBigqueryTimestamp(0L, TimeUnit.SECONDS)).isEqualTo("0.000000");
assertThat(toBigqueryTimestamp(0L, TimeUnit.MILLISECONDS)).isEqualTo("0.000000");
assertThat(toBigqueryTimestamp(0L, TimeUnit.MICROSECONDS)).isEqualTo("0.000000");
}
@Test
-public void test_toBigqueryTimestamp_datetimeConversion() {
+void test_toBigqueryTimestamp_datetimeConversion() {
assertThat(toBigqueryTimestamp(START_OF_TIME)).isEqualTo("0.000000");
assertThat(toBigqueryTimestamp(DATE_0)).isEqualTo("1405629342.000000");
assertThat(toBigqueryTimestamp(DATE_1)).isEqualTo("1405629342.100000");
@@ -127,18 +125,18 @@ public class BigqueryUtilsTest {
}
@Test
-public void test_toJobReferenceString_normalSucceeds() {
+void test_toJobReferenceString_normalSucceeds() {
assertThat(toJobReferenceString(new JobReference().setProjectId("foo").setJobId("bar")))
.isEqualTo("foo:bar");
}
@Test
-public void test_toJobReferenceString_emptyReferenceSucceeds() {
+void test_toJobReferenceString_emptyReferenceSucceeds() {
assertThat(toJobReferenceString(new JobReference())).isEqualTo("null:null");
}
@Test
-public void test_toJobReferenceString_nullThrowsNpe() {
+void test_toJobReferenceString_nullThrowsNpe() {
assertThrows(NullPointerException.class, () -> toJobReferenceString(null));
}
}


@@ -29,15 +29,12 @@ import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableReference;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
-import org.junit.Before;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
import org.mockito.ArgumentCaptor;
/** Unit tests for {@link CheckedBigquery}. */
-@RunWith(JUnit4.class)
-public class CheckedBigqueryTest {
+class CheckedBigqueryTest {
private final Bigquery bigquery = mock(Bigquery.class);
private final Bigquery.Datasets bigqueryDatasets = mock(Bigquery.Datasets.class);
@@ -48,8 +45,8 @@ public class CheckedBigqueryTest {
private CheckedBigquery checkedBigquery;
-@Before
-public void before() throws Exception {
+@BeforeEach
+void beforeEach() throws Exception {
when(bigquery.datasets()).thenReturn(bigqueryDatasets);
when(bigqueryDatasets.insert(eq("Project-Id"), any(Dataset.class)))
.thenReturn(bigqueryDatasetsInsert);
@@ -70,7 +67,7 @@ public class CheckedBigqueryTest {
}
@Test
-public void testSuccess_datastoreCreation() throws Exception {
+void testSuccess_datastoreCreation() throws Exception {
checkedBigquery.ensureDataSetExists("Project-Id", "Dataset-Id");
ArgumentCaptor<Dataset> datasetArg = ArgumentCaptor.forClass(Dataset.class);
@@ -83,7 +80,7 @@ public class CheckedBigqueryTest {
}
@Test
-public void testSuccess_datastoreAndTableCreation() throws Exception {
+void testSuccess_datastoreAndTableCreation() throws Exception {
checkedBigquery.ensureDataSetAndTableExist("Project-Id", "Dataset2", "Table2");
ArgumentCaptor<Dataset> datasetArg = ArgumentCaptor.forClass(Dataset.class);


@@ -18,15 +18,13 @@ import static com.google.common.truth.Truth.assertThat;
import static google.registry.config.RegistryConfig.CONFIG_SETTINGS;
import static google.registry.config.RegistryConfig.ConfigModule.provideReservedTermsExportDisclaimer;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.Test;
-@RunWith(JUnit4.class)
-public class RegistryConfigTest {
+/** Unit tests for {@link RegistryConfig}. */
+class RegistryConfigTest {
@Test
-public void test_reservedTermsExportDisclaimer_isPrependedWithOctothorpes() {
+void test_reservedTermsExportDisclaimer_isPrependedWithOctothorpes() {
assertThat(provideReservedTermsExportDisclaimer(CONFIG_SETTINGS.get()))
.isEqualTo("# Disclaimer line 1.\n" + "# Line 2 is this 1.");
}


@@ -14,16 +14,13 @@
package google.registry.config;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.Test;
/** Unit tests for {@link RegistryEnvironment}. */
-@RunWith(JUnit4.class)
-public class RegistryEnvironmentTest {
+class RegistryEnvironmentTest {
@Test
-public void testGet() {
+void testGet() {
RegistryEnvironment.get();
}
}


@@ -19,28 +19,25 @@ import static google.registry.testing.TaskQueueHelper.assertTasksEnqueued;
import com.google.common.base.Joiner;
import google.registry.model.ofy.CommitLogBucket;
-import google.registry.testing.AppEngineRule;
+import google.registry.testing.AppEngineExtension;
import google.registry.testing.TaskQueueHelper.TaskMatcher;
import google.registry.util.Retrier;
import google.registry.util.TaskQueueUtils;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
-import org.junit.Rule;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.JUnit4;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
/** Unit tests for {@link CommitLogFanoutAction}. */
-@RunWith(JUnit4.class)
-public class CommitLogFanoutActionTest {
+class CommitLogFanoutActionTest {
private static final String ENDPOINT = "/the/servlet";
private static final String QUEUE = "the-queue";
-@Rule
-public final AppEngineRule appEngine =
-AppEngineRule.builder()
+@RegisterExtension
+final AppEngineExtension appEngineRule =
+AppEngineExtension.builder()
.withDatastoreAndCloudSql()
.withTaskQueue(
Joiner.on('\n')
@@ -55,7 +52,7 @@ public class CommitLogFanoutActionTest {
.build();
@Test
-public void testSuccess() {
+void testSuccess() {
CommitLogFanoutAction action = new CommitLogFanoutAction();
action.taskQueueUtils = new TaskQueueUtils(new Retrier(null, 1));
action.endpoint = ENDPOINT;

Some files were not shown because too many files have changed in this diff.