5 posts tagged with "javascript"

· One min read
Jeffrey Aven

In the stackql project we needed an API to serve configuration file packages (stackql providers) to the stackql application at runtime.

Traditional artifact repositories or package managers were unsuitable as they were mainly designed for container images, JavaScript modules, Python packages etc. The artifacts, in this case, are signed tarball sets of OpenAPI specification documents (text files).

We have recently moved our provider registry (stackql-provider-registry) to use Deno Deploy as the serving layer (the API).

The code

The code is reasonably straightforward as shown here:
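
(The listing below is a minimal sketch rather than the original source; it assumes providers are published as .tar.gz assets under a providers/ directory and uses the Deno standard library HTTP server - see the stackql-provider-registry repo for the real implementation.)

// sketch only - routes, paths and registry layout here are assumptions
import { serve } from "https://deno.land/std@0.154.0/http/server.ts";

serve(async (req: Request) => {
  const { pathname } = new URL(req.url);
  // e.g. GET /providers/github/v0.1.0/provider.tar.gz (hypothetical path scheme)
  if (req.method === "GET" && pathname.startsWith("/providers/")) {
    try {
      const tarball = await Deno.readFile(`.${pathname}`);
      return new Response(tarball, {
        headers: { "content-type": "application/gzip" },
      });
    } catch {
      return new Response("provider not found", { status: 404 });
    }
  }
  return new Response("stackql provider registry", { status: 200 });
});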

The deployment

We are using GitHub Actions to push assets and code to Deno Deploy; this was straightforward as well, as you can see here:
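
(Again, a sketch only - the workflow below assumes the official denoland/deployctl GitHub Action with a hypothetical project name and entrypoint; the actual workflow lives in the stackql-provider-registry repo.)

name: deploy-registry
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write # required to authenticate the action with Deno Deploy
      contents: read
    steps:
      - uses: actions/checkout@v3
      - name: Deploy to Deno Deploy
        uses: denoland/deployctl@v1
        with:
          project: stackql-registry # hypothetical Deno Deploy project name
          entrypoint: server.ts # hypothetical entrypoint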

· 3 min read
Jeffrey Aven

Those who have built projects (front end or back end) with JavaScript or TypeScript have no doubt felt pain or frustration with some aspect of package management - package.json, package-lock.json, node_modules, npm, npmjs, yarn etc.

Enter Deno, a "package manager-less" runtime for JavaScript and TypeScript. That's right, no package.json, no node_modules folder, no npm or yarn.

Background

Deno was created by Ryan Dahl, the creator of Node.js, who recognised the monster that package management had become - package managers, dependencies and dependency management are in many cases more complex than the frameworks or projects being implemented.

Packages

I said Deno is "package manager-less"; however, it is not "package-less". Deno does not use npmjs; instead, packages (js or ts) can be hosted at any reachable URL. Local imports and exports are supported too.
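
For example (file names below are hypothetical), a local module is exported and imported like any ES module, using a relative path instead of a registry name:

// utils.ts - a local module
export function greet(name: string): string {
  return `Hello, ${name}!`;
}

// main.ts - imported by relative path, no package manager involved
import { greet } from "./utils.ts";
console.log(greet("Deno"));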

Deno's standard library packages ("batteries included" modules) are hosted at deno.land/std@LATEST_VERSION, and third-party modules are hosted at deno.land/x.

With deno installed, you can run something like this:

deno run https://deno.land/std@0.154.0/examples/welcome.ts

Imports

Using the Deno runtime, developers specify modules to use in their program using this import syntax:

import { Application, Router } from "https://deno.land/x/oak/mod.ts";

You can specify a version if desired in the URL, such as:

import {
  Bson,
  MongoClient,
} from "https://deno.land/x/mongo@v0.31.0/mod.ts";

Packages do not have to be downloaded or installed before running your code.

The first time your code is run, all packages are downloaded to a local cache and any TypeScript modules are transpiled to JavaScript.

Publishing with Deno Deploy

Deno Deploy is a package publishing framework that directly integrates with GitHub (no separate npm publish step). Deno Deploy is backed by a CDN and a network of edge servers to make Deno packages available. Packages can be published by a push to a branch, reviewed via a deployed preview, and merged to release to production.

Quickstart

To get going, you first need to download and install Deno, which will vary based upon your operating system, but all the usual suspects are available (Homebrew, Chocolatey, etc); see here.

Once you have installed Deno on your system, you can create a project folder (no npm init or package.json required) and create the following file (as server.ts), which will run a very simple middleware server using a third-party module, oak.

import { Application, Router } from "https://deno.land/x/oak/mod.ts";

const router = new Router();
router
  .get("/ping", (context) => {
    context.response.body = "pong";
  });

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8080 });

Now run your server using the following command:

deno run --allow-net server.ts

Now:

curl -XGET http://localhost:8080/ping

should return:

pong

easy!

· 2 min read
Jeffrey Aven

I had a scenario where I needed to find values for a key in a complex JavaScript object which could be nested n levels deep.

I found numerous approaches to doing this, most were overly complicated, so I thought I would share the most straightforward, concise process.

The code

You can do this in a straightforward function implementing the "tail call recursion" pattern to search for a key (key) from the root of an object (obj), excluding any keys in excludeKeys.

This will return a list of values for the given key, searching all levels in all branches of the object.

function getAllValuesForKey(obj, key, excludeKeys = [], values = []) {
  for (let k in obj) {
    if (typeof obj[k] === "object") {
      // descend into nested objects and arrays, unless the key is excluded
      if (!excludeKeys.includes(k)) {
        getAllValuesForKey(obj[k], key, excludeKeys, values);
      }
    } else {
      // scalar value - collect it if the key matches
      if (k === key) {
        values.push(obj[k]);
      }
    }
  }
  return values;
}
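
For example, given a small (hypothetical) nested object with "$ref" keys at different depths, one of them under an excluded key:

const doc = {
  a: { $ref: "#/one", examples: { $ref: "#/ignored" } },
  b: [{ c: { $ref: "#/two" } }],
};

console.log(getAllValuesForKey(doc, "$ref", ["examples"]));
// [ "#/one", "#/two" ]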

Example

In parsing an OpenAPI or Swagger specification, I am looking for all of the schema refs in a successful response body, for example:

paths:
  '/orgs/{org}/actions/permissions/selected-actions':
    get:
      ...
      responses:
        '200': '...'

However, these refs can be presented in various ways depending upon the response type, such as:

'200':
  $ref: '#/components/responses/actions_runner_labels'

or

'200':
  content:
    application/json:
      schema:
        $ref: '#/components/schemas/runner'

or

'200':
  content:
    application/json:
      schema:
        anyOf:
          - $ref: '#/components/schemas/interaction-limit-response'

or

'200':
  content:
    application/json:
      schema:
        type: object
        required:
          - total_count
          - runners
        properties:
          total_count:
            type: integer
          runners:
            type: array
            items:
              $ref: '#/components/schemas/runner'

To find all of the schema refs without knowing the response type or structure I used the above function as follows (excluding refs for examples):

function getRespSchemaName(op) {
  for (let respCode in op.responses) {
    if (respCode.startsWith('2')) {
      return getAllValuesForKey(op.responses[respCode], "$ref", ['examples']);
    }
  }
}

You can find this implementation in openapi-doc-util and @stackql/openapi-doc-util.

simple!

· 5 min read
Jeffrey Aven

Background

AWS Lambda instances will return UTC/GMT time for any date object created using new Date() (or Date.now()) in JavaScript, as shown here:

let now = new Date();
const tzOffset = now.getTimezoneOffset();
console.log(`Default Timezone Offset: ${tzOffset}`);
// results in ...
// Default Timezone Offset: 0

Moreover, Lambda instances are stateless and have no concept of local time. This can make dealing with dates more challenging.

This is compounded for localities which have legislated Daylight Savings Time during part of the year.

Solution

A simple approach (vanilla JavaScript - no third-party libraries or external API calls) to convert the time to local time, adjusted for Daylight Savings Time, is provided here:

function getGmtDstTransitionDate(year, month, transitionDay, hour) {
  const firstDayOfTheMonth = new Date(year, month, 1);
  let transitionDate = new Date(firstDayOfTheMonth);
  // find the first transition day of the month if the first day of the month is not a transition day
  if (firstDayOfTheMonth.getDay() !== transitionDay) {
    transitionDate = new Date(firstDayOfTheMonth.setDate(firstDayOfTheMonth.getDate() + (transitionDay - firstDayOfTheMonth.getDay())));
  }
  // return the transition date and time
  return new Date(transitionDate.getTime() + (hour * 60 * 60000));
}

function getLocalDateTime(date) {
  // default to GMT+11 for AEDT
  let offsetInHours = 11;
  // if the month is between April and October check further, if not return the AEDT offset
  // remember getMonth is zero based!
  if (date.getMonth() >= 3 && date.getMonth() <= 9) {
    // DST starts at 0200 on the first Sunday in October, which is 1600 (16) on the first Saturday (6) in October (9) GMT
    const dstStartDate = getGmtDstTransitionDate(date.getFullYear(), 9, 6, 16);
    // DST ends at 0300 on the first Sunday in April, which is 1600 (16) on the first Saturday (6) in April (3) GMT
    const dstEndDate = getGmtDstTransitionDate(date.getFullYear(), 3, 6, 16);
    if (date >= dstEndDate && date < dstStartDate) {
      offsetInHours = 10;
    }
  }
  // return the date and time in local time
  return new Date(date.getTime() + (offsetInHours * 60 * 60000));
}

// get current timestamp
let now = new Date();
console.log(`UTC Date: ${now}`);
now = getLocalDateTime(now);
console.log(`Local toLocaleString: ${now.toLocaleString()}`);

Breaking it down

This solution comprises two functions (for DRY purposes).

The main function getLocalDateTime takes a date object representing the current time in UTC and returns a date object representing the local (DST adjusted) time.

The getLocalDateTime function sets a default DST-adjusted offset in hours (11 in the case of AEDT); if the month is between April and October, getGmtDstTransitionDate is used to determine the exact boundaries between Standard Time and Daylight Savings Time.

In the case of AEST/AEDT, Daylight Savings Time starts at 0200 on the first Sunday in October and ends at 0300 on the first Sunday in April, when clocks return to Standard Time (GMT+10 for AEST); both dates and times are converted to their equivalent GMT times in the code.

The offsetInHours variable and the arguments for getGmtDstTransitionDate can be easily modified for other timezones.

Tests

Some simple tests can be run to check that the code is working correctly; to help with this, I have set up the following unit test function:

function unitTest(inputDate, expOutputDate, testCase) {
  if (getLocalDateTime(inputDate).toUTCString() === expOutputDate.toUTCString()) {
    console.log(`TEST PASSED ${testCase}`);
  } else {
    console.log(`TEST FAILED ${testCase} : input date in GMT ${inputDate} should equal ${expOutputDate}`);
  }
}

First, create dates representing the beginning of Daylight Savings Time (immediately before the transition, at the transition and immediately after the transition):

unitTest(new Date(2022, 9, 1, 15, 59, 59, 999), new Date(2022, 9, 2, 1, 59, 59, 999), "one ms before dst start");
// returns...
// ... INFO TEST PASSED one ms before dst start
unitTest(new Date(2022, 9, 1, 16, 0, 0, 0), new Date(2022, 9, 2, 3, 0, 0, 0), "dst start");
// returns...
// ... INFO TEST PASSED dst start
unitTest(new Date(2022, 9, 1, 16, 0, 0, 1), new Date(2022, 9, 2, 3, 0, 0, 1), "one ms after dst start");
// returns...
// ... INFO TEST PASSED one ms after dst start

Next, create similar tests with dates representing the end of Daylight Savings Time (or the beginning of Standard Time):

unitTest(new Date(2022, 3, 2, 15, 59, 59, 999), new Date(2022, 3, 3, 2, 59, 59, 999), "one ms before dst end");
// returns...
// ... INFO TEST PASSED one ms before dst end
unitTest(new Date(2022, 3, 2, 16, 0, 0, 0), new Date(2022, 3, 3, 2, 0, 0, 0), "dst end");
// returns...
// ... INFO TEST PASSED dst end
unitTest(new Date(2022, 3, 2, 16, 0, 0, 1), new Date(2022, 3, 3, 2, 0, 0, 1), "one ms after dst end");
// returns...
// ... INFO TEST PASSED one ms after dst end

Enjoy

· 2 min read
Jeffrey Aven

Snowflake

Snowflake allows roles to be assigned to other roles, so when a user is assigned to a role, they may inherit the ability to use countless other roles.

Challenge: recursively enumerate all roles for a given user

One solution would be to create a complex query on the "SNOWFLAKE"."ACCOUNT_USAGE"."GRANTS_TO_ROLES" object.

An easier solution is to use a stored procedure to recurse through grants for a given user and return an ARRAY of roles for that user.

This is a good programming exercise in tail call recursion (sort of) in JavaScript. Here is the code:
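
(The listing below is a minimal sketch of the approach rather than the original procedure - the procedure name, identifier quoting and SHOW GRANTS column handling are assumptions.)

-- sketch only: recursively walk SHOW GRANTS output and return an ARRAY of roles
-- caller's rights is needed so the procedure can run SHOW commands
CREATE OR REPLACE PROCEDURE GET_ALL_ROLES_FOR_USER(USER_NAME STRING)
RETURNS ARRAY
LANGUAGE JAVASCRIPT
EXECUTE AS CALLER
AS
$$
  var allRoles = [];

  // recurse through roles granted to a role, accumulating into allRoles
  function addRole(roleName) {
    if (allRoles.indexOf(roleName) !== -1) {
      return; // already visited - avoids cycles and duplicates
    }
    allRoles.push(roleName);
    var grants = snowflake.createStatement({
      sqlText: "SHOW GRANTS TO ROLE \"" + roleName + "\""
    }).execute();
    while (grants.next()) {
      // a role granted to this role appears as USAGE on a ROLE object
      if (grants.getColumnValue("privilege") === "USAGE" &&
          grants.getColumnValue("granted_on") === "ROLE") {
        addRole(grants.getColumnValue("name"));
      }
    }
  }

  // start from the roles granted directly to the user
  var userGrants = snowflake.createStatement({
    sqlText: "SHOW GRANTS TO USER \"" + USER_NAME + "\""
  }).execute();
  while (userGrants.next()) {
    addRole(userGrants.getColumnValue("role"));
  }

  return allRoles;
$$;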

To call the stored proc, execute:
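
For example (using the hypothetical procedure and user names from the sketch above):

CALL GET_ALL_ROLES_FOR_USER('JSMITH');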

One drawback of stored procedures in Snowflake is that they can only have scalar or array return types and cannot be used directly in a SQL query; however, you can use the table(result_scan(last_query_id())) trick to get around this, as shown below, where we pivot the ARRAY into a record set with the array elements as rows:
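
A query along these lines (again assuming the hypothetical procedure above) flattens the returned ARRAY into one row per role:

SELECT value::string AS role_name
FROM table(result_scan(last_query_id())),
     lateral flatten(input => $1);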

IMPORTANT

This query must be the next statement run immediately after the CALL statement and cannot be run again until you run another CALL statement.

More adventures with Snowflake soon!
