Text from the page:
I'm going to change my server-side code and it's going to be less complex thanks to Hasura. #buildinpublic #opensource #serverless
Text from the page:
production and how Hasura helps you through the process seamlessly with its advanced security and production-ready features. Praveen
Text from the page:
Create a subgraph
Have a MotherDuck-hosted DuckDB database, or a persistent DuckDB database file — for supplying data to your API.
The
Text from the page:
Yes
N/A
HASURA_BIGQUERY_PROJECT_ID — The project ID of the BigQuery database project (Required: Yes, Default: N/A)
HASURA_BIGQUERY_DATASET_ID — The dataset ID of the BigQuery database project (Required: Yes, Default: N/A)
Text from the page:
After the CLI initializes the connector, you
Text from the page:
Procedures: ✅
Custom Fields: ❌
Mutations: ❌ (Only native mutations are supported)
Distinct: ✅
Enums: ❌
Naming Conventions: ❌
Default Values
Text from the page:
that it is changed in the migrations directory as well to make sure the CLI commands continue to work for applying the migrations
Text from the page:
beyond that. Arguably, and much to the frustration of true GraphQL stans, GraphQL fragments and Relay-style GraphQL are not as popular
Text from the page:
the boilerplate code from the framework you choose, all from your command line inside your Hasura project.
Additional Reading:
Creating
Text from the page:
I have found that it can get quite confusing setting individual permissions for every field and worrying about the security implications
Text from the page:
Data Wrappers, inspired by Postgres's FDW (Foreign Data Wrappers, of course) that will allow people at Hasura, data-source vendors, and
Text from the page:
velocity post-launch - they did not want to have to build new APIs every time they wanted to expose new data to the frontend.
GraphQL
Text from the page:
in GraphQL is necessary to define and model the costs and benefits of adopting GraphQL in your organization: #1. Back to the backend
Text from the page:
Our early work with Hasura investigated why there was a large discrepancy between the memory usage reported by the operating system
Text from the page:
issue with cost centre profiling has been that adding cost centres severely affects how your program is optimised. This means that the
Text from the page:
a good six months to build out all the features that Hasura comes with out of the box.” A notable side benefit is that computational
Text from the page:
with it. So a role for a given bucket can access only that bucket, but someone with a global role can view any bucket. Both global roles
Text from the page:
default, these are two separate Dataloader caches, resulting in two separate requests for the same data. To fix this, you must manually
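A minimal TypeScript sketch of the problem and the fix described above. `TinyLoader` is a simplified stand-in for the `dataloader` package (not the real library), and `fetchUsers` is a hypothetical batch function; the point is that two independent loader instances mean two caches, and therefore two batched requests for the same key, while a single shared instance dedupes them.

```typescript
// Simplified stand-in for the dataloader package, just enough to show caching.
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

class TinyLoader<K, V> {
  private cache = new Map<K, Promise<V>>();
  private queue: { key: K; resolve: (v: V) => void }[] = [];
  private scheduled = false;

  constructor(private batchFn: BatchFn<K, V>) {}

  load(key: K): Promise<V> {
    const hit = this.cache.get(key);
    if (hit) return hit; // cache hit: no new request is queued
    const promise = new Promise<V>((resolve) => {
      this.queue.push({ key, resolve });
      if (!this.scheduled) {
        this.scheduled = true;
        // Flush the accumulated batch on the next microtask tick.
        Promise.resolve().then(() => this.flush());
      }
    });
    this.cache.set(key, promise);
    return promise;
  }

  private async flush() {
    const batch = this.queue;
    this.queue = [];
    this.scheduled = false;
    const values = await this.batchFn(batch.map((e) => e.key));
    batch.forEach((e, i) => e.resolve(values[i]));
  }
}

// Hypothetical batch function; in a real app this would hit a database.
let dbHits = 0;
const fetchUsers: BatchFn<number, string> = async (ids) => {
  dbHits++;
  return ids.map((id) => `user-${id}`);
};

async function main() {
  // Two resolvers each creating their own loader: two caches, two requests.
  const loaderA = new TinyLoader(fetchUsers);
  const loaderB = new TinyLoader(fetchUsers);
  await Promise.all([loaderA.load(1), loaderB.load(1)]);
  console.log("separate loaders, db hits:", dbHits); // 2

  // The fix: share one loader instance (e.g. via the per-request context),
  // so both lookups hit one cache and one batched request.
  dbHits = 0;
  const shared = new TinyLoader(fetchUsers);
  await Promise.all([shared.load(1), shared.load(1)]);
  console.log("shared loader, db hits:", dbHits); // 1
}
main();
```

In a GraphQL server the usual way to share the instance is to construct the loader once per request in the context object, rather than inside individual resolvers.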
Text from the page:
IO), apply authorization checks for each element in the response (involves IO and compute), and then send the result back to the client
Text from the page:
the third check because we need to create our tag count check in a roundabout way to get around a minor bug in the platform. We will
Text from the page:
A
is generated on every subscription operation sent by the client. Coincidentally, a
is
also generated.
A
is generated per client, when
Text from the page:
plugins are not executed.
Let's take an example where the engine is configured with two pre-parse plugins:
and
.
In this example, the
Text from the page:
process. A schema file is a JSON file that contains the results of an introspection query. Conventionally, this file is called To
Text from the page:
with Query Parameters (as key-value pairs)
Request URL with Query Parameters (as raw string)
Request URL with Query Parameters (as key
Text from the page:
URL with Query Parameters (as raw string)
Request URL with Query Parameters (as key-value pairs)
You can provide
type query params to
Text from the page:
supergraphs and subgraphs on Hasura DDN.
Flags vs Context
We are explicitly using flags here to show what arguments are set; some of
Text from the page:
collection argument names.
Example:
ArgumentMapping
Mapping of a command or model argument name to the corresponding argument name used
Text from the page:
procedure argument names.
Example:
ArgumentMapping
Mapping of a command or model argument name to the corresponding argument name used
Text from the page:
fundamentals of SQL databases and learn the core concepts. Some of them include Schema, Tables, Data Types and Columns, Primary Keys
Text from the page:
easier, and perform custom logic using existing data. Views execute every time the query is made. Materialized Views are similar to views
Text from the page:
As your metadata has changed, create a new build:
Bring down the services by pressing CTRL+C and start them back up:
Step 18. Insert
Text from the page:
queries and mutations in one file named
Create a new file at the root
and add the following code:
Todo/api.graphql
We have now written
Text from the page:
which was used to populate sample data.
Add this function to run in the
Woot! You have written your first GraphQL integration with iOS
Text from the page:
after first load.
We should update the Apollo cache with the todos received from fetchMore requests by implementing the updateQuery method
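The merge that an updateQuery callback typically performs for pagination can be sketched in plain TypeScript. The `Todo`/`TodosResult` shapes and field names here are illustrative assumptions, not the tutorial's actual schema; the point is just to append the newly fetched page onto the previously cached list.

```typescript
// Illustrative shapes; field names are assumptions, not the tutorial's schema.
interface Todo {
  id: number;
  title: string;
}
interface TodosResult {
  todos: Todo[];
}

// The merge an updateQuery callback usually performs for pagination:
// keep the previously cached todos and append the newly fetched page.
function mergeTodos(prev: TodosResult, fetchMoreResult?: TodosResult): TodosResult {
  if (!fetchMoreResult) return prev; // nothing new arrived: keep the cache as-is
  return { todos: [...prev.todos, ...fetchMoreResult.todos] };
}

const cached: TodosResult = { todos: [{ id: 1, title: "buy milk" }] };
const nextPage: TodosResult = { todos: [{ id: 2, title: "write docs" }] };
console.log(mergeTodos(cached, nextPage).todos.length); // 2
```

Returning a new object (rather than mutating `prev`) matters, because Apollo's cache relies on referential changes to detect updates.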
Text from the page:
compare the previous id with the current id via the subscription; if the new id is greater than the previous id, it will increase the notification count
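The comparison described above can be sketched as a small pure function. The names and the choice to bump the count by the id difference (i.e. one notification per new row) are illustrative assumptions, not taken from the tutorial's code.

```typescript
// Sketch of the id-comparison logic: each subscription event carries the
// latest id seen on the server; if it exceeds the last id we rendered,
// increase the notification count by the number of new items.
function updateNotificationCount(
  previousId: number,
  currentId: number,
  count: number
): { count: number; latestId: number } {
  if (currentId > previousId) {
    return { count: count + (currentId - previousId), latestId: currentId };
  }
  return { count, latestId: previousId }; // nothing new: state unchanged
}

console.log(updateNotificationCount(10, 12, 0)); // { count: 2, latestId: 12 }
```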
Text from the page:
of the file: TaskFragment.kt We are making an optimistic update and then doing the actual mutation over the cloud. Let's add
Text from the page:
Mutation and update cache
Open and add the following code to define the bulk delete mutation src
Text from the page:
can be provided via the following variables, which can either be supplied via Hasura Cloud, or as
environment variable arguments to the