Will a JavaScript tag's src attribute follow HTTP redirects in all browsers? - javascript

Let's say a script tag's src attribute points to a redirect:
<script src="http://foo.com/foo.js"></script>
where http://foo.com/foo.js is a 301 redirect to https://foo.com/foo.js...
Will all browsers successfully load the JS file? I've noticed it seems to work in Chrome, Firefox, Safari, and IE9... but I'm just curious if this is something that's in a spec or just random...

You can check out the following topic on behavior of different browsers to handle 301 redirect:
Client Web Browser Behavior When Handling 301 Redirect

Loading resources for a webpage (be it a script source, an image source, or anything else) is agnostic to the resource type: the browser fetches them all the same way, using HTTP over TCP/IP, so a 301 on a script URL is followed just like a 301 on a page.
The only things to be aware of are that the browser makes two requests (one to the original URL, one to the redirect target) to download one resource, and that script loads are blocking in the browser, so it is not advisable to rely on this strategy long-term. The three very basic reasons we use 301s are:
Prettify URLs
Ensure link equity
Resolve canonical issues
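Back to the original question: a quick way to observe the redirect being followed is a minimal fetch() sketch, since fetch() follows redirects just as transparently as a script tag does (assuming it runs on a page served from foo.com, so the request is same-origin):
fetch('http://foo.com/foo.js')
  .then(function (response) {
    console.log(response.redirected); // true if a redirect was followed
    console.log(response.url);        // the final URL, e.g. "https://foo.com/foo.js"
  });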

Related

A website is blocked from being embedded in an iframe. I need a legal workaround that allows me to post my store on my website [duplicate]

I am developing a web page that needs to display, in an iframe, a report served by another company's SharePoint server. They are fine with this.
The page we're trying to render in the iframe is giving us X-Frame-Options: SAMEORIGIN which causes the browser (at least IE8) to refuse to render the content in a frame.
First, is this something they can control or is it something SharePoint just does by default? If I ask them to turn this off, could they even do it?
Second, can I do something to tell the browser to ignore this http header and just render the frame?
If the 2nd company is happy for you to access their content in an IFrame then they need to take the restriction off - they can do this fairly easily in the IIS config.
There's nothing you can do to circumvent it, and anything that does work should get patched quickly in a security hotfix. You can't tell the browser to just render the frame when the source content's header says it is not allowed in frames; that would make session hijacking easier.
If the content is GET-only and you don't post data back, you could fetch the page server-side and proxy the content without the header, but then any postback would be invalidated.
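A minimal sketch of that server-side proxy idea, assuming Node.js 18+ with Express; REPORT_URL and the route name are illustrative, not the other company's actual endpoints:
const express = require('express');
const app = express();

const REPORT_URL = 'https://other-company.example/report'; // hypothetical

app.get('/report-proxy', async (req, res) => {
  const upstream = await fetch(REPORT_URL); // global fetch, Node 18+
  const body = await upstream.text();
  // Re-serve the content from our own origin; we simply never forward
  // the X-Frame-Options header, so the browser will allow framing.
  res.set('Content-Type', upstream.headers.get('content-type') || 'text/html');
  res.send(body);
});

app.listen(3000);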
UPDATE 2019-12-30:
It seems that this tool is no longer working! [Request for update!]
UPDATE 2019-01-06: You can bypass X-Frame-Options in an <iframe> using my X-Frame-Bypass Web Component. It extends the IFrame element by using multiple CORS proxies and it was tested in the latest Firefox and Chrome.
You can use it as follows:
(Optional) Include the Custom Elements with Built-in Extends polyfill for Safari:
<script src="https://unpkg.com/#ungap/custom-elements-builtin"></script>
Include the X-Frame-Bypass JS module:
<script type="module" src="x-frame-bypass.js"></script>
Insert the X-Frame-Bypass Custom Element:
<iframe is="x-frame-bypass" src="https://example.org/"></iframe>
The X-Frame-Options header is a security feature enforced at the browser level.
If you have control over your user base (IT dept for corp app), you could try something like a greasemonkey script (if you can a) deploy greasemonkey across everyone and b) deploy your script in a shared way)...
Alternatively, you can proxy their result. Create an endpoint on your server, and have that endpoint open a connection to the target endpoint, and simply funnel traffic backwards.
Yes, Fiddler is an option:
Open Fiddler menu > Rules > Customize Rules (this effectively edits CustomRules.js).
Find the function OnBeforeResponse
Add the following lines:
oSession.oResponse.headers.Remove("X-Frame-Options");
oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
Remember to save the script!
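For context, the edited handler in CustomRules.js (which is JScript.NET) ends up looking roughly like this; the two added lines are the ones from the steps above:
static function OnBeforeResponse(oSession: Session) {
    // ...Fiddler's existing response rules stay here...
    oSession.oResponse.headers.Remove("X-Frame-Options");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
}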
As for the second question: you can use Fiddler filters to set the response X-Frame-Options header manually to something like ALLOW-FROM *. But, of course, this trick will work only for you; other users still won't be able to see the iframe content (unless they do the same).

Can't embed a badge in HTML

I have a certification and a badge provided by Acclaim. I want to embed it in my personal website but it's not working. Here's the code they provided:
<div data-iframe-width="150" data-iframe-height="270" data-share-badge-id="60615e70-6409-4752-9d77-3553a43d13d2" data-share-badge-host="https://www.youracclaim.com"></div>
<script type="text/javascript" async src="//cdn.youracclaim.com/assets/utilities/embed.js"></script>
but even when simply put onto an empty html:5 page, I get the error: Loading failed for the <script> with source “file:///assets/utilities/embed.js”.
What's the problem here? I'm not sure how Acclaim can provide a ready-to-paste script that simply doesn't work; nothing shows up on the website. I'm guessing the problem is at the src... part, but I don't know how to fix it.
If you're loading your page via file:, then protocol-relative URLs aren't going to work. The script tag has:
src="//cdn.youracclaim.com/assets/utilities/embed.js"
This should be changed to:
src="https://cdn.youracclaim.com/assets/utilities/embed.js"
You'll find, though, that when you're using an actual web server this is a non-issue. The reason for protocol-relative URLs was so that HTTP pages would use the HTTP version and HTTPS pages the HTTPS version. This method is outdated anyway: HTTPS should be used everywhere, even when loading JavaScript into an HTTP page.

Modify <meta> tag with JS (Chrome extension) on receiving a response

I have a Chrome extension that adds a panel to the page in a floating iframe (on extension button click). There's certain JS code that is downloaded from a 3rd-party host and needs to be executed on that page. Obviously there's an XSS issue, and the extension needs to comply with the content security policies of that page.
Previously I had to deal with CSP directives that are passed via response headers, and was able to override those by setting a hook in chrome.webRequest.onHeadersReceived, where I added my host URL to the content-security-policy header (a sketch of that hook is below). It worked: headers were replaced, the new directives applied to the page, all good.
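A minimal sketch of that header-rewriting approach, assuming manifest v2 with the "webRequest" and "webRequestBlocking" permissions; myhost.com comes from the question, everything else is illustrative:
chrome.webRequest.onHeadersReceived.addListener(
  function (details) {
    var headers = details.responseHeaders.map(function (h) {
      if (h.name.toLowerCase() === 'content-security-policy') {
        // Append our host to frame-src so our iframe is allowed
        return { name: h.name, value: h.value.replace(/frame-src/i, 'frame-src https://myhost.com') };
      }
      return h;
    });
    return { responseHeaders: headers };
  },
  { urls: ['<all_urls>'] },
  ['blocking', 'responseHeaders']
);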
Now I have discovered websites that set the CSP directives via a <meta> tag instead of response headers. For example, app pages in iTunes (https://itunes.apple.com/us/app/olympics/id808794344?mt=8) do this. There is also an additional meta tag named web-experience-app/config/environment (?) that somewhat duplicates the values set in the content of the tag with http-equiv="Content-Security-Policy".
This time I am trying to add my host name to the meta tag inside chrome.webNavigation.onCommitted or onCompleted event listeners (vanilla JS via chrome.tabs.executeScript). I also experimented with running the same code from webRequest's onCompleted listener (the last step of the lifecycle according to https://developer.mozilla.org/en-US/Add-ons/WebExtensions/API/webRequest).
When I inspect the page after it loads, I see the meta tags have changed. But when I click on my extension to start loading the iframe and execute JS, the console prints the following errors:
Refused to frame 'https://myhost.com' because it violates the following Content Security Policy directive: "frame-src 'self' *.apple.com itmss: itms-appss: itms-bookss: itms-itunesus: itms-messagess: itms-podcasts: itms-watchs: macappstores: musics: apple-musics:".
I.e. my tag update was not effective.
I have several questions. First, am I doing this right, and am I doing the update at the proper event? When in the page lifecycle is the content of the meta tags read? Will it be automatically re-applied after the tag content changes?
As of March 2018, Chromium doesn't allow extensions to modify the response body of a request: https://bugs.chromium.org/p/chromium/issues/detail?id=487422#c29
"WebRequest API: allow extensions to read response body" is a ticket from 2015. It is not on a path to being resolved and needs some work/help.
--
Firefox has a webRequest filter implementation that allows modifying the response body before the page's meta directives are applied:
https://developer.mozilla.org/en-US/Add-ons/WebExtensions/API/webRequest/filterResponseData
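A Firefox-only sketch of that filter, rewriting the HTML before the document (and thus its <meta> CSP) reaches the parser; the frame-src rewrite and myhost.com are illustrative:
browser.webRequest.onBeforeRequest.addListener(
  function (details) {
    var filter = browser.webRequest.filterResponseData(details.requestId);
    var decoder = new TextDecoder('utf-8');
    var encoder = new TextEncoder();
    var html = '';
    filter.ondata = function (event) {
      html += decoder.decode(event.data, { stream: true });
    };
    filter.onstop = function () {
      // Loosen the meta CSP before the document is parsed
      filter.write(encoder.encode(html.replace(/frame-src/i, 'frame-src https://myhost.com')));
      filter.close();
    };
  },
  { urls: ['<all_urls>'], types: ['main_frame'] },
  ['blocking']
);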
BUT, my problem is focused on fixing the Chrome extension. Maybe Chrome picks this up one day.
--
In general, Chrome's extension framework does not seem like a reliable path for building long-lived software: browser vendors change the rules frequently, react to newly discovered threats, and there is no up-to-date, supported cross-browser standard.
--
In my case, a possible way around this issue is to move all the JS code into the extension's own source base, so that there is no third party to fetch and execute JS from (and nothing to conflict with or violate the CSP rules). I haven't explored this yet, as I expected to reuse the code & interactive components I am using in my main browser application.
I've been interested in the same things, and here are a few aspects that could perhaps help:
The chrome.debugger extension API with the Fetch (or Network) domain can be used to modify the response body: https://chromedevtools.github.io/devtools-protocol/tot/Fetch/
This is an example implementation: https://github.com/mr-yt12/Debugger-API-Fetch-example-Chrome-Extension
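A rough sketch of that debugger/Fetch approach (illustrative only, ignoring binary/encoding concerns; see the linked example for a full implementation). It assumes the "debugger" permission and a known tabId:
chrome.debugger.attach({ tabId: tabId }, '1.3', function () {
  chrome.debugger.sendCommand({ tabId: tabId }, 'Fetch.enable', {
    patterns: [{ requestStage: 'Response' }]
  });
});

chrome.debugger.onEvent.addListener(function (source, method, params) {
  if (method !== 'Fetch.requestPaused') return;
  chrome.debugger.sendCommand(source, 'Fetch.getResponseBody',
    { requestId: params.requestId }, function (body) {
      var html = body.base64Encoded ? atob(body.body) : body.body;
      // Modify the body (e.g. the meta CSP), then fulfill the request
      chrome.debugger.sendCommand(source, 'Fetch.fulfillRequest', {
        requestId: params.requestId,
        responseCode: 200,
        body: btoa(html.replace(/frame-src/i, 'frame-src https://myhost.com'))
      });
    });
});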
However, I'm facing a problem of Fetch.requestPaused not firing on the first page load: Chrome Extension Debugger API, Fetch domain attaches/enables too late for the response body to be intercepted
I haven't found a solution to this yet, besides first redirecting the request to 'http://google.com/gen_204' and then updating the tab. But this creates flicker, and I'm also not sure it's possible to redirect the request like this with manifest v3.
When using the debugger API, Chrome shows a warning bar at the top, which changes the page's size and doesn't go away (perhaps it goes away 5 seconds after the debugger is detached, or if the user clicks "cancel"). This means it's mostly only good for personal use, or when distributing as a developer-mode extension. Launching Chrome with the --silent-debugger-extension-api flag disables this warning.
I've tried injecting my script at document_start (when the meta tag is not yet created). Then I used a MutationObserver (and also tried other methods) to wait for the meta tag and modify it before it is applied or fully created. Somehow it succeeded once or a few times, but that could be a coincidence or a misinterpretation of the results. Perhaps it's worth experimenting with; a rough sketch follows.
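A content-script sketch of that experiment (run_at: document_start; as said above, this is racy and may well lose to the parser; the frame-src rewrite is illustrative):
new MutationObserver(function (mutations, observer) {
  var meta = document.querySelector('meta[http-equiv="Content-Security-Policy"]');
  if (meta) {
    // Try to loosen the CSP before it is applied
    meta.content = meta.content.replace(/frame-src/i, 'frame-src https://myhost.com');
    observer.disconnect();
  }
}).observe(document.documentElement, { childList: true, subtree: true });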
Another idea (which I don't think I got working, but which may be possible) is to call window.stop() at document_start and then rewrite the HTML content programmatically. This needs more research.
It seems that a meta-tag CSP is applied once the meta tag is created (or while it is being created), and there is then no way to cancel what has been applied. More research is needed on how to prevent it from applying, or how to modify the tag before it is fully created or applied.

Firefox doesn't send cookie in cross-domain request declared in @font-face

What I want:
I need to add some dynamic CSS styles, with fonts from a different domain where authentication is needed, to the page.
Problem:
Firefox doesn't send the cookie in the font request (in Chrome everything works fine).
Situation:
Firefox v30.0. The CSS file and the font live on a different domain (in the example, the foreign domain is called "bbb.com"). The foreign domain "bbb" requires authentication. After successful authentication, the client receives a cookie with a PHP session ID, which is used for all other requests to the "bbb.com" domain.
Example:
Authentication for domain "bbb.com" has been executed successfully.
By an AJAX request to the URL "https://bbb.com/issue2604dead.css", the client receives the content of the CSS file as a string (see the sketch after this list).
JavaScript creates a <style> element with the content received in the previous step.
The CSS styles work fine and the style of the affected elements changes.
One of the definitions in the CSS is the following font reference: @font-face{font-family:dead_3;src:url(https://bbb.com/dead_3.ttf) format("opentype");}
The browser tries to get the font from that URL, but the client receives response status "401 Unauthorized".
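For reference, a minimal sketch of steps 2 and 3 (jQuery is assumed, as in the fiddle; withCredentials is what makes the cross-domain request carry the PHP session cookie, and bbb.com is assumed to answer with the matching CORS headers):
$.ajax({
  url: 'https://bbb.com/issue2604dead.css',
  xhrFields: { withCredentials: true } // send the session cookie cross-domain
}).done(function (css) {
  // Inject the received CSS text as a <style> element
  $('<style>').text(css).appendTo('head');
});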
EDITED:
I made example on jsFiddle: http://jsfiddle.net/RightFiveLeft/Pu4C2/9/
I found that Firefox doesn't send the cookie because the reference is set in a DOM element. If you load the same resource by AJAX, the browser will send the cookie.
So in my jsFiddle example, the cookie will not be sent for the CSS request created by the DOM link element.
EDITED 2
Sorry, I added the wrong example to jsFiddle. Version 9 should finally be correct :-D
You're right, there's something weird going on. In Safari (7.0.5), both the CSS and the font request return 401.
If you open the font or CSS request in a new Safari tab, it returns 401. If you then refresh the fiddle, both requests work correctly (200).
In Firefox (31.0), the font issue persists. This could be a Firefox policy for requests to another domain, or maybe simply a bug.
I found something that could be related, but requires access to the webserver to test it:
CSS #font-face not working with Firefox, but working with Chrome and IE
By the way, it looks like there are differences and issues in the implementation of these requests even in other browsers, so I wouldn't rely too much on the cookie-based authentication mechanism.
Another way could be to slightly change the implementation of getFile.php, accepting a GET parameter (token) for authentication. For example, you could craft the request as:
.../getFile.php?fileName=testFont.ttf&_=1445&token=<auth token>
and test the authentication parameter against $_GET['token'] instead of the cookie value.
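A client-side sketch of that idea; getAuthToken is a hypothetical helper standing in for however the page obtains its token, and the URL shape follows the example above:
var authToken = getAuthToken(); // hypothetical helper
var style = document.createElement('style');
style.textContent =
  '@font-face{font-family:dead_3;' +
  'src:url("https://bbb.com/getFile.php?fileName=dead_3.ttf&token=' +
  encodeURIComponent(authToken) + '") format("opentype");}';
document.head.appendChild(style);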
If you can't or don't want to solve it this way, I'd suggest evaluating the possibility of keeping (only) the font without authentication. If this is a viable solution, the best option would be to serve it from a CDN.

Is there a way to avoid downloading resources (images, CSS and JS files) with JavaScript?

I have a html page on my localhost - get_description.html.
The snippet below is part of the code:
<input type="text" id="url"/>
<button id="get_description_button">Get description</button>
<iframe id="description_container" src="#"/>
When the button is clicked, the src of the iframe is set to the URL entered in the textbox. The pages fetched this way are very big, with lots of linked files. All I am interested in on the page is a block of text contained in a <div id="description"> element.
Is there a way to avoid downloading the resources linked in the page that loads into the iframe?
I don't want to use curl, because the data is only available to logged-in users and the steps needed to get the content with curl are too complicated. The iframe is simple: I use it on a box which sends the right cookies to identify the request as coming from a logged-in user. The problem is that it is very wasteful to download nearly 1 MB of data just to keep 1 KB of it and throw out the rest.
Edit
If the proposed method works only in Firefox, that is fine, so I added the firefox tag. It is also possible that the answer lies in the realm of Firefox add-on techniques, so I added that tag as well.
The problem is not that I cannot get at what I'm looking for; rather, the easy iframe method is wasteful.
I know that Firefox allows loading only the text of a page. If you open a page and press Ctrl+U you are taken to the 'view page source' window. There, links behave as normal and are clickable; if you click a link in source view, the source of the new page is loaded into the view-source window without the linked resources being downloaded, which is exactly what I'm trying to get. But I don't know how to access this behaviour.
Another example is the Adblock add-on: it somehow kills elements before they get loaded. With plain JavaScript this is not possible, because it is triggered too late to intervene in time.
The Same Origin Policy forbids a web page from accessing the contents of any page on a different domain, so basically you cannot do that.
However, some browsers allow a page's content to be read if you are accessing it from a local web page, which seems to be your case.
Safari and IE 6/7/8 are browsers that allow a local web page to do so via XMLHttpRequest (source: Google Browser Security Handbook), so you may want to use one of those browsers for this (note that future versions of those browsers may not allow it anymore).
Apart from this solution, I only see two possibilities:
If the web pages you need to fetch content from are somehow controlled by you, you can create a simpler interface to let other web pages get the content they need (for example by allowing JSONP requests, as sketched after this list).
If the web pages you need to fetch content from are not controlled by you, the only solution I see is to fetch the content server-side, logging in from the server directly (I know you don't want to do this, but I don't see any other possibility if the previous ones are not practicable).
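A minimal JSONP sketch for the first possibility; the endpoint URL and the handleDescription callback are illustrative, and the controlled server is assumed to wrap its JSON payload in the callback named in the query string:
function handleDescription(data) {
  // The server responds with: handleDescription({"description": "..."})
  document.getElementById('description_container').textContent = data.description;
}
var s = document.createElement('script');
s.src = 'https://controlled-site.example/description?callback=handleDescription'; // hypothetical
document.head.appendChild(s);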
Hope it helps.
Actually, I've seen cross-domain jQuery .load requests before, here: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The author claims that code like the following, found on that page,
$('#container').load('http://google.com'); // SERIOUSLY!
$.ajax({
    url: 'http://news.bbc.co.uk',
    type: 'GET',
    success: function(res) {
        var headline = $(res.responseText).find('a.tsh').text();
        alert(headline);
    }
});
// Works with $.get too!
would work. (The BBC code might not work because of the recent redesign, but you get the idea)
Apparently it uses YQL wrapped in a jQuery plugin to do the trick. I cannot say I fully understand what he is doing there, but it appears to work and fits the bill. Once you load the data, I suppose it is a simple matter of filtering out the part you need.
If you prefer something that works at the browser level, may I suggest Mozilla's Jetpack framework for lightweight extensions. I've not yet read the documentation in its entirety, but it should contain the APIs needed for this to work.
There are various ways to go about this with AJAX; I'm going to show the jQuery way for brevity, though you could do this in vanilla JavaScript as well.
Instead of an <iframe> you can just use a container, let's say a <div> like this:
<div id="description_container"></div>
Then to load it:
$(function() {
    $("#get_description_button").click(function() {
        // Use the #url input from the question's markup
        $("#description_container").load($("#url").val() + " #description");
    });
});
This uses the .load() method, which takes a string in the format .load("url selector"): it fetches the URL, finds the element matching the selector in the returned page, and places its content inside the container you're loading into, in this case #description_container.
This is just the jQuery route, mainly to illustrate that yes, you can do what you want; the point is to get what you need from an AJAX request rather than from an <iframe>.
Your description sounds like you are fetching pages from the same domain (you said that you need to be logged in and have session credentials), so have you tried an async request via XMLHttpRequest? It might complain if the HTML on a page is particularly messed up, but you should still be able to get the raw text via .responseText and extract what you need with a regex.
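A minimal sketch of that approach; it assumes the iframe is swapped for a plain div container, as in the previous answer, and that the target block really is <div id="description"> as described:
var xhr = new XMLHttpRequest();
xhr.open('GET', document.getElementById('url').value, true);
xhr.onload = function () {
  // Crude regex extraction; fine for a known, stable page structure
  var match = xhr.responseText.match(/<div id="description">([\s\S]*?)<\/div>/);
  if (match) {
    document.getElementById('description_container').innerHTML = match[1];
  }
};
xhr.send();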
