Error uploading Feed document with amazon-sp-api: Invalid initialization vector - javascript

I'm trying to upload a Feed document to Amazon using the createFeedDocument operation of the Selling Partner API. After the API call, I received a response object containing the feedDocumentId, url, and encryptionDetails, which includes the standard, initializationVector, and key. However, when I try to upload the Feed document using the returned URL, I get an error saying "Invalid initialization vector".
I am using version 0.7.10.
This is the response I get from the callAPI function:
{
  feedDocumentId: '3d4e42b5-1d6e-44e8-a89c-XXXXXXX',
  url: 'https://d34o8swodXXX.cloudfront.net/Feed_101__POST_PRODUCT_DATA_%2BKEY%3DFeed_101%2BMode%3DCBC%2BINITVEC%3D8f+6c+cc+56+0d+50+a2+d0+31+ec+80+be+f2+6a+1d+0a',
  encryptionDetails: {
    standard: 'AES',
    initializationVector: '8f 6c cc 56 0d 50 a2 d0 31 ec 80 be f2 6a 1d 0a',
    key: 'key'
  }
}
The issue I'm facing is with the createCipheriv function call within the SellingPartner.js file of the amazon-sp-api package. Specifically, the error is occurring when the code tries to create a cipher using the initialization vector returned by the Selling Partner API's createFeedDocument operation. The error message reads "TypeError: Invalid initialization vector". I am unable to check or modify the initialization vector as it is returned by the Selling Partner API. The goal of my code is to upload a feed to list a new product to the Amazon catalog, which does not yet exist. The feed object itself should not be related to the issue.
Can someone help me resolve this error? Thank you!

Cardano-wallet malformed tx payload from Submit External Transaction

I’m trying to submit an already signed tx from cardano-cli using the cardano-wallet endpoint: https://localhost:8090/v2/proxy/transactions
The signed transaction looks like this:
txBody = {
    "type": "Tx MaryEra",
    "description": "",
    "cborHex": "83a400818258202d7928a59fcba5bf71c40fe6428a301ffda4d2fa681e5357051970436462b89400018282583900c0e88694ab569f42453eb950fb4ec14cb50f4d5d26ac83fdec2c505d818bcebf1df51c84239805b8a330d68fdbc3c047c12bb4c3172cb9391a002b335f825839003d2d9ceb1a47bc1b62b7498ca496b16a7b4bbcc6d97ede81ba8621ebd6d947875fcf4845ef3a5f08dd5522581cf6de7b9c065379cbb3754d1a001e8480021a00029361031a01672b7ea1008182582073dd733cc50d594cb89d3ac67287b02dae00982fc800e9c9c9d1bb282b56122558404d0cb4e4f1cc415ddcf546871f075d0ca6e0c2620cd784b06c21c9b86e4403cb7a115038487576dcb20e7820e9d0dc93ab2a737ed9d0a71a77bc1e12f7c4dd0ef6"
}
I just don’t know how to pass it to the endpoint using Content-Type application/octet-stream. The API doc says the payload should be:
string <binary>
Signed transaction message binary blob.
I’m using javascript for this and have tried passing the cborHex directly, using Buffer.from(txBody.cborHex).toString('base64') and the whole json Buffer.from(JSON.stringify(txBody)).toString('base64') but always got the same response:
{
    "code": "malformed_tx_payload",
    "message": "I couldn't verify that the payload has the c…node. Please check the format and try again."
}
Also, I've noticed from the Swagger specification that the endpoint supports a JSON payload, and taking a look at the cardano-wallet source code here:
newtype PostExternalTransactionData = PostExternalTransactionData
    { payload :: ByteString
    } deriving (Eq, Generic, Show)
I thought the structure should be something like this:
{
"payload": ?// some binary blob here that I can't find. I've tried with:
// Buffer.from(txBody.cborHex).toString('base64') and
// Buffer.from(JSON.stringify(txBody)).toString('base64')
}
Any idea how to construct the payload and pass the signed tx?
This code tells me that, when decoding an external (sealed) transaction, the wallet tries Base16 encoding first and, if that fails, tries Base64:
instance FromJSON (ApiT SealedTx) where
    parseJSON v = do
        tx <- parseSealedTxBytes #'Base16 v <|> parseSealedTxBytes #'Base64 v
        pure $ ApiT tx
After that, the ByteString is passed to this function https://github.com/input-output-hk/cardano-node/blob/5faa1d2bb85ae806ec51fa4c576dec2670c67c7a/cardano-api/src/Cardano/Api/SerialiseCBOR.hs#L32 together with the currentNodeEra that the node is running.
(Each era has a different way of decoding)
It could be (I am not sure) that the node is running, say, Alonzo while you're submitting a Mary-encoded Tx, in which case the decoding might fail.
I hope this helps.
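One more hedged sketch that may help: with Content-Type application/octet-stream, the body should be the *decoded* CBOR bytes. Buffer.from(txBody.cborHex) encodes the hex string itself as ASCII text; passing the 'hex' encoding yields the raw transaction bytes instead. The fetch call at the end is illustrative only.

```javascript
// Hypothetical sketch: decode the CBOR hex into raw bytes before POSTing.
const txBody = {
    type: "Tx MaryEra",
    description: "",
    cborHex: "83a40081", // truncated here; use the full cborHex
};

// 'hex' decodes the string into raw bytes; the default ('utf8') would
// just encode the hex characters as text, which is not the transaction.
const payload = Buffer.from(txBody.cborHex, "hex");
console.log(payload.length); // half the length of the hex string

// POSTing the raw bytes (any HTTP client works; fetch shown):
// await fetch("https://localhost:8090/v2/proxy/transactions", {
//     method: "POST",
//     headers: { "Content-Type": "application/octet-stream" },
//     body: payload,
// });
```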

Pytorch LSTM in ONNX.js - Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4

I am trying to run a Pytorch LSTM network in browser. But I am getting this error:
graph.ts:313 Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4
at t.buildGraph (graph.ts:313)
at new t (graph.ts:139)
at Object.from (graph.ts:77)
at t.load (model.ts:25)
at session.ts:85
at t.event (instrument.ts:294)
at e.initialize (session.ts:81)
at e.<anonymous> (session.ts:63)
at onnx.min.js:14
at Object.next (onnx.min.js:14)
How can I resolve this? Here is my code for saving the model to onnx:
net = torch.load('trained_model/trained_model.pt')
net.eval()
with torch.no_grad():
    input = torch.tensor([[1,2,3,4,5,6,7,8,9]])
    h0, c0 = net.init_hidden(1)
    output, (hn, cn) = net.forward(input, (h0, c0))
    torch.onnx.export(net, (input, (h0, c0)), 'trained_model/trained_model.onnx',
                      input_names=['input', 'h0', 'c0'],
                      output_names=['output', 'hn', 'cn'],
                      dynamic_axes={'input': {0: 'sequence'}})
I put input as the only dynamic axis since it is the only one that can vary in size. With this code, the model saves properly as trained_model.onnx. It does give me a warning:
UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.
warnings.warn("Exporting a model to ONNX with a batch_size other than 1, "
This warning is a little confusing since I am exporting it with a batch_size of 1:
input has shape torch.Size([1, 9])
h0 has shape torch.Size([2, 1, 256]) - corresponding to (num_lstm_layers, batch_size, hidden_dim)
c0 also has shape torch.Size([2, 1, 256])
But since I do define h0/c0 as inputs of the model I don't think this relates to the problem.
This is my javascript code for running in the browser:
<script src="https://cdn.jsdelivr.net/npm/onnxjs/dist/onnx.min.js"></script>
<!-- Code that consumes ONNX.js -->
<script>
// create a session
const myOnnxSession = new onnx.InferenceSession();
console.log('trying to load the model')
// load the ONNX model file
myOnnxSession.loadModel("./trained_model.onnx").then(() => {
console.log('successfully loaded model!')
// after this I generate input and run the model
// since my code fails before this it isn't relevant
});
</script>
Based on the console.log statements, it is failing to load the model. How should I resolve this? If relevant, I'm using Python 3.8.5, PyTorch 1.6.0, ONNX 1.8.0.
For anyone coming across this in the future, I believe I'm getting this error because even though ONNX supports Pytorch LSTM networks, ONNX.js does not support it yet.
To get around this, instead of running in the browser I may use a simple web application framework called streamlit.

Display power usage as temperature with Node-Red in Apple HomeKit

I would like to display the power usage in HomeKit. Unfortunately there is no category to do that in HomeKit. That's why I had the idea to display this not as a power usage but as temperature in HomeKit. The idea is to control HomeKit scenes with the fake temperature sensor.
Unfortunately I have no experience with Node-RED; it is all new to me.
I got the following string from the electricity meter:
success: "true"
response: string
{
"power": 3.040480,
"relay": true
}
I link this to the HomeKit Node which then returns the following error:
Characteristic response cannot be written.
Try one of these: Name, CurrentTemperature, StatusActive, StatusFault, StatusLowBattery, StatusTampered, Name
After various functions and other adjustments I unfortunately don't get the "temperature" displayed in HomeKit.
I use this:
https://flows.nodered.org/node/#plasma2450/node-red-contrib-homekit-bridged
I think you cannot directly link up the two nodes. The error message suggests that you passed the meter output payload straight to the HomeKit node; since that payload contains a response property, which is not a supported Characteristic, the error occurs.
Make sure your payload only consists of the supported Characteristics. You can use a change node to modify the payload, or simply use a function node:
// Function node: emit only a supported Characteristic.
const meterOutput = msg.payload;
// Just a prototype, type checking is required; if `response` arrives
// as a JSON string, parse it first.
const response = typeof meterOutput.response === "string"
    ? JSON.parse(meterOutput.response)
    : meterOutput.response;
msg.payload = {
    CurrentTemperature: response.power
};
return msg;

JSON Formatting from Particle Photon Webhook to Azure

I have hooked up my Particle Photon to log my temperature and publish the events to an Event Hub at Microsoft Azure. Then I use Stream Analytics to output the JSON file into Azure Storage (good idea, or not?).
When I try to open it with <script src="URL"></script> through HTML, I get "Uncaught Syntax error: Unexpected token :" in the browser console window.
I also tried to validate my JSON file with a JSON formatter and got a lot of errors.
Here is the JSON file: https://pptlbhstorage.blob.core.windows.net/temperature/0_d1e8a2b709b14461b5ac12265f33020b_1.json
From Stream Analytics I created a job with this query:
CREATE TABLE pptlbhhub (
    coreid nvarchar(max),
    data nvarchar(max),
    event nvarchar(max),
    EventEnqueuedUtcTime datetime,
    EventProcessedUtcTime datetime,
    measurename nvarchar(max),
    PartitionId bigint,
    published_at datetime,
    subject nvarchar(max),
    timecreated datetime,
    unitofmeasure nvarchar(max),
    value float
);

SELECT
    coreid
    ,event
    ,EventEnqueuedUtcTime
    ,EventProcessedUtcTime
    ,measurename
    ,PartitionId
    ,published_at
    ,subject
    ,timecreated
    ,unitofmeasure
    ,value
INTO
    pptlbhstorage
FROM
    pptlbhhub;
I mean your file should look like this (i.e. like an array) to be valid JSON:
[{"coreid":"1e0041000c47343432313031","displayname":"IoT Assignment 3","event":"PublishToEventHub","eventenqueuedutctime":"2016-05-12T08:56:20.5300000Z","eventprocessedutctime":"2016-05-12T08:56:21.3068971Z","guid":"1e0041000c47343432313031","location":"Oslo","measurename":"Temperature","organization":"Westerdals ACT","partitionid":0,"published_at":"2016-05-12T08:56:21.0850000Z","subject":"Weather","timecreated":"2016-05-12T08:56:21.0850000Z","unitofmeasure":"F","value":21.0},
{"coreid":"1e0041000c47343432313031","displayname":"IoT Assignment 3","event":"PublishToEventHub","eventenqueuedutctime":"2016-05-12T08:56:20.5300000Z","eventprocessedutctime":"2016-05-12T08:56:21.3068971Z","guid":"1e0041000c47343432313031","location":"Oslo","measurename":"Temperature","organization":"Westerdals ACT","partitionid":0,"published_at":"2016-05-12T08:56:21.0850000Z","subject":"Weather","timecreated":"2016-05-12T08:56:21.0850000Z","unitofmeasure":"F","value":21.0}, ...]
Not like
{"coreid":"1e0041000c47343432313031","displayname":"IoT Assignment 3","event":"PublishToEventHub","eventenqueuedutctime":"2016-05-12T08:56:20.5300000Z","eventprocessedutctime":"2016-05-12T08:56:21.3068971Z","guid":"1e0041000c47343432313031","location":"Oslo","measurename":"Temperature","organization":"Westerdals ACT","partitionid":0,"published_at":"2016-05-12T08:56:21.0850000Z","subject":"Weather","timecreated":"2016-05-12T08:56:21.0850000Z","unitofmeasure":"F","value":21.0}
{"coreid":"1e0041000c47343432313031","displayname":"IoT Assignment 3","event":"PublishToEventHub","eventenqueuedutctime":"2016-05-12T08:56:20.5300000Z","eventprocessedutctime":"2016-05-12T08:56:21.3068971Z","guid":"1e0041000c47343432313031","location":"Oslo","measurename":"Temperature","organization":"Westerdals ACT","partitionid":0,"published_at":"2016-05-12T08:56:21.0850000Z","subject":"Weather","timecreated":"2016-05-12T08:56:21.0850000Z","unitofmeasure":"F","value":21.0}...
The difference is the "[]" at the beginning & the end of the stream and the "," between each element.
If you output the events as an array, your JSON file will be ok :
[{}, {}, ...]
instead of
{} {} {} ...
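If changing the output format is not an option, the line-delimited file can also be repaired client-side; a minimal sketch, where `raw` stands in for the downloaded file's text:

```javascript
// Hypothetical workaround: split the line-delimited JSON output and
// parse each line, yielding the array shape shown above.
const raw = '{"measurename":"Temperature","value":21.0}\n' +
            '{"measurename":"Temperature","value":21.5}\n';

const records = raw
    .trim()
    .split("\n")
    .map((line) => JSON.parse(line));

console.log(records.length); // 2
console.log(JSON.stringify(records)); // a valid JSON array: [{...},{...}]
```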
You might be interested in Particle's first-class integration with Azure IoT Hub to help you get Particle device data into Microsoft Azure. For more information on how to get this set up, check out:
https://docs.particle.io/tutorials/integrations/azure-iot-hub/

Struggling to build a JS/PHP validation function for my app

I have a web service that returns a JSON object when it is queried and a match is found; an example of a successful return is below:
{"terms":[{"term":{"termName":"Focus Puller","definition":"A focus puller or 1st assistant camera..."}}]}
If the query does not produce a match it returns:
Errant query: SELECT termName, definition FROM terms WHERE termID = xxx
Now, when I access this through my Win 8 Metro app, I parse the JSON object using the following code to get a JS object:
var searchTerm = JSON.parse(Result.responseText);
I then have code that processes searchTerm and binds the returned values to the app page control. If I enter a successful query that finds a match in the DB, everything works great.
What I can't work out is a way of validating a bad query. I want to test the value that is returned by var searchTerm = JSON.parse(Result.responseText) and continue doing what I'm doing now if it is a successful result, but then handle the result differently on failure. What check should I make to test this? I am happy to implement additional validation either in my app or in the web service, any advice is appreciated.
Thanks!
There are a couple of different ways to approach this.
One approach would be to utilize the HTTP response headers to relay information about the query (i.e. HTTP 200 status for a found record, 404 for a record that is not found, 400 for a bad request, etc.). You could then inspect the response code to determine what you need to do. The pro of this approach is that this would not require any change to the response message format. The con might be that you then have to modify the headers being returned. This is more typical of the approach used with true RESTful services.
Another approach might be to return success/error messaging as part of the structured JSON response. Such that your JSON might look like:
{
    "result": "found",
    "message":
    {
        "terms": [{"term": {"termName": "Focus Puller", "definition": "A focus puller or 1st assistant camera..."}}]
    }
}
You could obviously change the value of result in the data to return an error and place the error message in message.
The pros here are that you don't have to worry about header modification and that your returned data will always be parseable via JSON.parse(). The con is the extra verbosity in your response messaging.
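A minimal sketch of the second approach's client side: parse defensively, then check the "result" flag. parseSearchResult is an assumed helper name, not part of the service.

```javascript
// Hypothetical helper: returns the message on success, null otherwise.
function parseSearchResult(responseText) {
    let data;
    try {
        data = JSON.parse(responseText);
    } catch (e) {
        // Non-JSON response, e.g. the raw "Errant query: ..." string
        return null;
    }
    return data && data.result === "found" ? data.message : null;
}
```

The caller binds the returned message to the page controls when it is non-null and shows a "no match" state otherwise.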
