Merge remote-tracking branch 'origin/main' into rh/2406-clipboard

CocoByte 2024-07-31 17:26:48 -06:00
commit e0ea11d909
No known key found for this signature in database
GPG key ID: BBFAA2526384C97F
3 changed files with 141 additions and 40 deletions


@@ -15,12 +15,28 @@ assignees: abroddrick
 ## Installation
 There are several tools we use locally that you will need to have.
-- [ ] [Install the cf CLI v7](https://docs.cloudfoundry.org/cf-cli/install-go-cli.html#pkg-mac) for the ability to deploy
+- [ ] [Cloudfoundry CLI](https://docs.cloudfoundry.org/cf-cli/install-go-cli.html#pkg-mac) Note: if you are on Windows, the CLI will be under `cf8` or `cf7`, depending on which version you install.
   - If you are using Windows, installation information can be found [here](https://github.com/cloudfoundry/cli/wiki/V8-CLI-Installation-Guide#installers-and-compressed-binaries)
   - Alternatively, for Windows, [consider using Chocolatey](https://community.chocolatey.org/packages/cloudfoundry-cli/7.2.0)
-- [ ] Make sure you have `gpg` >2.1.7. Run `gpg --version` to check. If not, [install gnupg](https://formulae.brew.sh/formula/gnupg)
-  - Alternatively, you can skip this step and [use ssh keys](#setting-up-commit-signing-with-ssh) instead
-- [ ] Install the [Github CLI](https://cli.github.com/)
+- [ ] [GPG](https://gnupg.org/download/)
+  - Make sure you have `gpg` >2.1.7. Run `gpg --version` to check. If not, [install gnupg](https://formulae.brew.sh/formula/gnupg)
+  - This may not work on DHS devices. Alternatively, you can [use ssh keys](#setting-up-commit-signing-with-ssh) instead.
+- [ ] Docker Community Edition*
+- [ ] Git*
+- [ ] VSCode (our preferred editor)*
+- [ ] Github Desktop* or the Github CLI*
+
+The following tools are optional but recommended. For DHS devices, these can be requested through the DHS IT portal:
+- [ ] Slack Desktop App**
+- [ ] Python 3.10*
+- [ ] NodeJS (latest version available)*
+- [ ] Putty*
+- [ ] Windows Subsystem for Linux*
+
+* Must be requested through the DHS IT portal on DHS devices
+** Downloadable via the DHS Software Center
 ## Access
@@ -37,7 +53,12 @@ cf login -a api.fr.cloud.gov --sso
 **Note:** As mentioned in the [Login documentation](https://developers.login.gov/testing/), the sandbox Login account is a different account from your regular, production Login account. If you have not created a Login account for the sandbox before, you will need to create a new account first.
-- [ ] Optional- add yourself as a codeowner if desired. See the [Developer readme](https://github.com/cisagov/getgov/blob/main/docs/developer/README.md) for how to do this and what it does.
+Follow the [.gov onboarding dev setup instructions](https://docs.google.com/document/d/1ukbpW4LSqkb_CCt8LWfpehP03qqfyYfvK3Fl21NaEq8/edit#heading=h.94jwfwkpkhdx). Confirm you successfully set up the following accounts:
+- [ ] Identity sandbox accounts: 1 superuser access account and 1 analyst access account.
+- [ ] Login.gov account to access stable
+
+**Optional**
+- [ ] Add yourself as a codeowner if desired. See the [Developer readme](https://github.com/cisagov/getgov/blob/main/docs/developer/README.md) for how to do this and what it does.
 ### Steps for the onboarder
 - [ ] Add the onboardee to the cloud.gov org (cisa-dotgov)
@@ -124,3 +145,19 @@ Additionally, consider a gpg key manager like Kleopatra if you run into issues w
 We have three types of environments: stable, staging, and sandbox. Stable (production) and staging (pre-prod) get deployed via tagged release, and developer sandboxes are given to get.gov developers to mess around in a production-like environment without disrupting stable or staging. Each sandbox is namespaced and will automatically be deployed to when the appropriate branch syntax is used for that space in an open pull request. There are several things you need to set up to make the sandbox work for a developer.
 All automation for setting up a developer sandbox is documented in the scripts for [creating a developer sandbox](../../ops/scripts/create_dev_sandbox.sh) and [removing a developer sandbox](../../ops/scripts/destroy_dev_sandbox.sh). A Cloud.gov organization administrator will have to run the script in order to create the sandbox.
+
+## Known Issues
+
+### SSL Verification Failure
+Some developers using Government Furnished Equipment (GFE) have problems using tools such as git and pip due to SSL verification failures. This happens because GFE has a custom certificate chain installed, but these tools use their own certificate bundles. As a result, when they try to verify an SSL connection, they cannot, and the connection fails. To resolve this in pip, you can pass `--use-feature=truststore` to direct pip to use the local certificate store. If you are running into this issue when using git on Windows, run `git config --global http.sslbackend schannel`.
+If you are running into these issues in a Docker container, you will need to export the root certificate and pull it into the container. Ask another developer how to do this properly.
+
+### Puppeteer Download Error
+When building the node image, either individually or with docker compose, there may be an error caused by a node package called Puppeteer. This can be resolved by adding `ENV PUPPETEER_SKIP_DOWNLOAD=true` to [node.Dockerfile](../../src/node.Dockerfile) after the COPY command.
+
+### Checksum Error
+There is an unresolved issue with Python package installation that occurs after the above SSL verification failure has been resolved. It often manifests as a checksum error, where the hash of a downloaded `.whl` file (Python package) does not match the expected value. This appears to be because pythonhosted.org is cutting off download connections to some devices for some packages (the behavior is somewhat inconsistent). We have outstanding issues with PyPA and DHS IT to fix this. In the meantime we have a [workaround](#developing-using-docker).
+
+## Developing Using Docker
+While we have unresolved issues with certain devices, you can pull a pre-built Docker image from matthewswspence/getgov-base that comes with all the needed packages installed. To do this, you will need to change the very first line in the main [Dockerfile](../../src/Dockerfile) to `FROM matthewswspence/getgov-base:latest`. Note: this change will need to be reverted before any branch can be merged. Additionally, this will only resolve the [checksum error](#checksum-error); you will still need to resolve any other issues through the listed instructions. We are actively working to resolve this inconvenience.
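For reference, the SSL workarounds described in the Known Issues section above amount to two commands. This is a hedged sketch of configuration commands, not something to run blindly: `requests` is only an example package name, and pip's truststore feature requires pip >= 22.2 on Python >= 3.10.

```shell
# Let pip validate TLS against the OS certificate store, which contains
# the GFE certificate chain ("requests" is just an example package):
pip install --use-feature=truststore requests

# On Windows, make git use the native Windows TLS stack (schannel),
# which also trusts the GFE certificate chain:
git config --global http.sslbackend schannel
```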


@@ -56,14 +56,27 @@ class Command(BaseCommand):
             self.clean_table(table_name)

     def clean_table(self, table_name):
-        """Delete all rows in the given table"""
+        """Delete all rows in the given table.
+
+        Delete in batches to be able to handle large tables."""
         try:
             # Get the model class dynamically
             model = apps.get_model("registrar", table_name)
-            # Use a transaction to ensure database integrity
-            with transaction.atomic():
-                model.objects.all().delete()
-            logger.info(f"Successfully cleaned table {table_name}")
+            BATCH_SIZE = 1000
+            total_deleted = 0
+            # Get the initial batch of primary keys
+            pks = list(model.objects.values_list("pk", flat=True)[:BATCH_SIZE])
+            while pks:
+                # Use a transaction to ensure database integrity
+                with transaction.atomic():
+                    deleted, _ = model.objects.filter(pk__in=pks).delete()
+                    total_deleted += deleted
+                    logger.debug(f"Deleted {deleted} {table_name}s, total deleted: {total_deleted}")
+                # Get the next batch of primary keys
+                pks = list(model.objects.values_list("pk", flat=True)[:BATCH_SIZE])
+            logger.info(f"Successfully cleaned table {table_name}, deleted {total_deleted} rows")
         except LookupError:
             logger.error(f"Model for table {table_name} not found.")
         except Exception as e:
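The batched-delete pattern in this hunk can be illustrated outside Django. The following is a minimal, hypothetical stand-in using `sqlite3` rather than Django models and `transaction.atomic()`: fetch up to `BATCH_SIZE` primary keys, delete that batch, and repeat until the table is empty.

```python
import sqlite3

BATCH_SIZE = 1000

def clean_table(conn, table_name):
    """Delete all rows from table_name in batches; return the total deleted."""
    total_deleted = 0
    while True:
        # Fetch the next batch of primary keys (sqlite's implicit rowid here)
        pks = [row[0] for row in conn.execute(
            f"SELECT rowid FROM {table_name} LIMIT {BATCH_SIZE}")]
        if not pks:
            break
        placeholders = ",".join("?" for _ in pks)
        cur = conn.execute(
            f"DELETE FROM {table_name} WHERE rowid IN ({placeholders})", pks)
        total_deleted += cur.rowcount
        conn.commit()  # commit each batch, mirroring one transaction per batch
    return total_deleted

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE domains (name TEXT)")
conn.executemany("INSERT INTO domains VALUES (?)",
                 [(f"d{i}.gov",) for i in range(2500)])
print(clean_table(conn, "domains"))  # 2500
```

Deleting in fixed-size batches keeps each transaction small, so a very large table never holds one long-running lock or builds one huge deletion in memory.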


@@ -810,7 +810,7 @@ class TestCleanTables(TestCase):
     @override_settings(IS_PRODUCTION=False)
     def test_command_cleans_tables(self):
         """test that the handle method functions properly to clean tables"""
-        with less_console_noise():
-            with patch("django.apps.apps.get_model") as get_model_mock:
-                model_mock = MagicMock()
-                get_model_mock.return_value = model_mock
+        with patch("django.apps.apps.get_model") as get_model_mock:
+            model_mock = MagicMock()
+            get_model_mock.return_value = model_mock
@@ -819,27 +819,60 @@
             with patch(
                 "registrar.management.commands.utility.terminal_helper.TerminalHelper.query_yes_no_exit",  # noqa
                 return_value=True,
             ):
+                # One batch of pks to be returned for each of the 11 tables
+                pk_batch = [1, 2, 3, 4, 5, 6]
+                # Create a list of batches with alternating non-empty and empty lists
+                pk_batches = [pk_batch, []] * 11

+                # Set the side effect of values_list to return different pk batches:
+                # the first time values_list is called it returns a list of 6 pks
+                # to delete; the next time it returns an empty list, ending the loop.
+                def values_list_side_effect(*args, **kwargs):
+                    if args == ("pk",) and kwargs.get("flat", False):
+                        return pk_batches.pop(0)
+                    return []

+                model_mock.objects.values_list.side_effect = values_list_side_effect

+                # Mock the return value of delete() to be (6, ...)
+                model_mock.objects.filter.return_value.delete.return_value = (6, None)

                 call_command("clean_tables")

                 table_names = [
                     "DomainInformation",
                     "DomainRequest",
+                    "FederalAgency",
                     "PublicContact",
+                    "HostIp",
+                    "Host",
                     "Domain",
                     "User",
                     "Contact",
                     "Website",
                     "DraftDomain",
-                    "HostIp",
-                    "Host",
                 ]
-                # Check that each model's delete method was called
-                for table_name in table_names:
-                    get_model_mock.assert_any_call("registrar", table_name)
-                    model_mock.objects.all().delete.assert_called()
-                self.logger_mock.info.assert_any_call("Successfully cleaned table DomainInformation")
+                expected_filter_calls = [call(pk__in=[1, 2, 3, 4, 5, 6]) for _ in range(11)]
+                actual_filter_calls = [c for c in model_mock.objects.filter.call_args_list if "pk__in" in c[1]]
+                try:
+                    # Assert that filter(pk__in=...) was called with the expected arguments
+                    self.assertEqual(actual_filter_calls, expected_filter_calls)
+                    # Check that delete() was called for each batch
+                    for batch in [[1, 2, 3, 4, 5, 6]]:
+                        model_mock.objects.filter(pk__in=batch).delete.assert_called()
+                    for table_name in table_names:
+                        get_model_mock.assert_any_call("registrar", table_name)
+                        self.logger_mock.info.assert_any_call(
+                            f"Successfully cleaned table {table_name}, deleted 6 rows"
+                        )
+                except AssertionError as e:
+                    print(f"AssertionError: {e}")
+                    raise

     @override_settings(IS_PRODUCTION=False)
     def test_command_handles_nonexistent_model(self):
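The `values_list_side_effect` trick in the hunk above can be shown in isolation. This is a hypothetical miniature using only `unittest.mock`, not the project's test code: each matching call to the mocked `values_list()` pops the next prepared batch, so the code under test sees one non-empty batch and then an empty one.

```python
from unittest.mock import MagicMock

# Prepared batches: one non-empty batch, then an empty batch to end the loop
pk_batches = [[1, 2, 3], []]

def values_list_side_effect(*args, **kwargs):
    # Only respond to the exact call shape used by the batching code
    if args == ("pk",) and kwargs.get("flat", False):
        return pk_batches.pop(0)
    return []

model_mock = MagicMock()
model_mock.objects.values_list.side_effect = values_list_side_effect

print(model_mock.objects.values_list("pk", flat=True))  # [1, 2, 3]
print(model_mock.objects.values_list("pk", flat=True))  # []
```

Because `side_effect` is consulted on every call, the mock can simulate a shrinking table without any database at all.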
@@ -870,15 +903,33 @@
         with patch("django.apps.apps.get_model") as get_model_mock:
             model_mock = MagicMock()
             get_model_mock.return_value = model_mock
-            model_mock.objects.all().delete.side_effect = Exception("Some error")
+            # Mock values_list so that DomainInformation attempts a delete
+            pk_batches = [[1, 2, 3, 4, 5, 6], []]

+            def values_list_side_effect(*args, **kwargs):
+                if args == ("pk",) and kwargs.get("flat", False):
+                    return pk_batches.pop(0)
+                return []

+            model_mock.objects.values_list.side_effect = values_list_side_effect

+            # Mock delete to raise a generic exception
+            model_mock.objects.filter.return_value.delete.side_effect = Exception("Mocked delete exception")

             with patch(
-                "registrar.management.commands.utility.terminal_helper.TerminalHelper.query_yes_no_exit",  # noqa
+                "registrar.management.commands.utility.terminal_helper.TerminalHelper.query_yes_no_exit",
                 return_value=True,
             ):
-                call_command("clean_tables")
-                self.logger_mock.error.assert_any_call("Error cleaning table DomainInformation: Some error")
+                with self.assertRaises(Exception) as context:
+                    # Execute the command
+                    call_command("clean_tables")
+                # Check that the exception message matches the mocked error
+                self.assertEqual(str(context.exception), "Mocked delete exception")
+                # Assert that delete was called
+                model_mock.objects.filter.return_value.delete.assert_called()

 class TestExportTables(MockEppLib):
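The exception-path test above pairs a raising `side_effect` with an exception check. A minimal, hypothetical sketch of that pattern, outside Django and `TestCase`:

```python
from unittest.mock import MagicMock

# delete() is mocked to raise; the caller verifies both the exception
# message and that delete() was actually invoked.
model_mock = MagicMock()
model_mock.objects.filter.return_value.delete.side_effect = Exception("Mocked delete exception")

caught = None
try:
    model_mock.objects.filter(pk__in=[1, 2]).delete()
except Exception as e:
    caught = e

print(caught)  # Mocked delete exception
model_mock.objects.filter.return_value.delete.assert_called()
```

Note that the assertion message must match the `side_effect` message exactly; a mismatch between the two (e.g. asserting "Custom delete error") makes the test fail even when the code under test behaves correctly.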