This article is rated B-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:

This article is substantially duplicated by a piece in an external publication. Please do not flag this article as a copyright violation of the following source:
Is there any need for the tiny images of each of the three types of vulnerability on the NVD? The images are impossible to see without opening the full versions of them, and they add nothing to the article. I'm going to remove the images; if anyone disagrees they can revert and tell me why. I also believe the exploit scenarios should be merged into the descriptions of each type of attack, but I will not do that right now-- WikiSolved ( talk) 18:41, 25 June 2009 (UTC)
CLARIFICATION NEEDED ON Type-0 attack: In this section, under the subsection Type-0 attack, bullet #3 says, "The malicious web page's JavaScript opens a vulnerable HTML page installed locally on Alice's computer." There is no explanation of how the vulnerable HTML page got installed locally on Alice's computer or how Mallory knew about it. This is the crux of this attack, so without this part of the explanation the scenario is not useful. I haven't found an answer to this or I would have corrected the article. I'm hoping someone with more knowledge of this attack will read this and add the clarifying information.
The first paragraph in the section "Other forms of mitigation" is garbage. Just quoting text will not stop it from being interpreted as HTML. I can always put "> into the text to close the tag. That whole section should be either removed or heavily modified. It is naive and inaccurate. -- 129.97.84.62 15:06, 4 April 2006 (UTC)
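The IP's point about naive quoting is sound; what actually stops the "> breakout is entity-encoding the untrusted text before placing it in markup. A minimal sketch in Python (the payload is a made-up example; `html.escape` is standard library):

```python
import html

# A made-up attacker payload that tries to close the attribute and tag
payload = '"><script>alert(1)</script>'

# Merely placing the text inside quotes leaves the markup intact
naive = '<input value="' + payload + '">'

# Entity-encoding the payload neutralises the quotes and angle brackets
safe = '<input value="' + html.escape(payload, quote=True) + '">'

print('<script>' in naive)  # True: the breakout works
print('<script>' in safe)   # False: the payload is inert text
```

This is only a sketch of the principle; attribute, URL, and script contexts each need their own encoding rules.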
"Cross-site scripting (XSS) is a type of computer security vulnerability [...] XSS enables attackers to inject client-side script into Web pages". Is this correct in the first place? I think the term cross-site scripting in itself only describes a page on one site containing a script hosted on another site (or is it having a script hosted on a site sending/loading data to/from another site? Not sure), which may or may not imply a vulnerability. I think what is described in this article is a particular use (or abuse) of XSS. Perhaps it should be called "XSS injection", or something like that?
Or am I wrong? Teo8976 ( talk) 21:24, 26 January 2015 (UTC)
There should be some mention of the two different approaches -- blacklisting (i.e., removing anything that can be recognised as a potential script injection) and whitelisting (i.e., only allowing stuff that can be determined not to be a potential script injection). If I had references for this kind of stuff, I'd add it myself, but I came here looking for them. :( JulesH 17:10, 27 July 2006 (UTC)
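To illustrate JulesH's two approaches, here is a rough sketch; the function names and pattern/character sets are invented for illustration, and real filters are far more thorough:

```python
import re

def blacklist_filter(text):
    # Blacklisting: strip recognised dangerous patterns; anything the
    # pattern list fails to anticipate gets through untouched
    return re.sub(r'(?is)<\s*script[^>]*>.*?<\s*/\s*script\s*>', '', text)

def whitelist_filter(text):
    # Whitelisting: keep only characters known to be harmless
    allowed = set('abcdefghijklmnopqrstuvwxyz'
                  'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,!?')
    return ''.join(c for c in text if c in allowed)

evil = '<img src=x onerror=alert(1)>'
print(blacklist_filter(evil))  # unchanged: this vector isn't on the list
print(whitelist_filter(evil))  # angle brackets, '=' and '()' are gone
```

The example shows why blacklisting is the weaker approach: it passes any injection vector its pattern list does not anticipate, while whitelisting fails safe.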
I like the recent example of PayPal's XSS hole. However, it isn't mentioned what type it is. Is it a type 1 XSS? If so, we can probably remove the ATutor example, since it isn't very well known, and replace it with the PayPal one. We should also keep the number of examples down to 4-5 if possible. It could easily grow to 1000 if everyone put their favorite in, but we don't need that. -- TDM 13:32, 8 August 2006 (UTC)
Someone added the notice recently that this page may not meet Wikipedia's standards due to the need for restructuring. Could whoever added that elaborate? I don't see any new comments here specific to that evaluation. If there's some other organization that would work better, I'd be willing to improve the document. TDM 23:57, 19 October 2006 (UTC)
I'd like to add a link to HTML Purifier in the Prevention section, as it implements the most reliable method: parsing and stripping all tags/attributes not in the whitelist (as well as other protection). Unfortunately, I wrote the library, so if I add it myself it's vanity. So could someone take a look and, if it looks useful, add the link for me? Thanks! — Edward Z. Yang( Talk) 23:36, 29 November 2006 (UTC)
I'm not familiar enough with this article to know exactly where this should go, but I think this presentation of a Google Desktop vulnerability is extremely educational - they show how such small vulnerabilities in this case end up cascading into complete control over the victim's computer. The vulnerabilities they use are all patched (I think including one glitch that's server-side), so they no longer work, so it should be safe to show. This sounds like Type 2 in the article. — AySz88 \ ^-^ 20:39, 22 February 2007 (UTC)
I was almost certain we'd previously had a discussion on this, but obviously this is not the case. So, I'll bring it to the floor now.
I am strongly opposed to including the acronym "CSS" in the introduction paragraph of the article. It is a misleading term that no one uses anymore, as the Terminology section already states, and thus, while it deserves mention in that section, it should not be in the intro. — Edward Z. Yang( Talk) 22:42, 28 February 2007 (UTC)
It's well-structured, concise, disambiguating, sufficiently detailed, and very clear. 64.221.248.17 22:24, 6 April 2007 (UTC)
Disagree. The "Persistent" section is stupid. One, Mallory is a girl's name. Two, hackers do not watch football. Three, hackers would not use a dating website to find mates, if a hacker needed a mate they would use IRC or a relatively unknown underground chat program. Four, a hacker would use a girl's name for the purpose of elite deception. With this level of incongruity to reality, you might as well talk about Batman and Lex Luthor battling demons in space. — Preceding unsigned comment added by 69.1.35.199 ( talk) 15:47, 16 June 2011 (UTC)
There seems to also be a band called XSS. I don't know how to do disambiguation pages, and I'm not an expert on XSS (that's why I was looking them up), but maybe someone can help clarify this? All I know about the band XSS is that they sing sort of hip-hop style R&B in English, and that they're at least popular in the Middle East.
Are XSS and the difficulty with interpreting and reformatting HTML some of the reasons why wikis don't use HTML for formatting? I know that one reason for not using HTML is that it might be difficult for some wiki users to learn. But it seems that the wiki formatting also helps prevent XSS while giving the users some control. -- Lance E Sloan 16:57, 8 August 2007 (UTC)
I have been getting strange XSS warnings in FF 2.0.0.6 from Wikipedia articles with images lately. Does anyone know if there has been a change in the template formatting of images or if it's a FF bug?
Isn't cross-site scripting really an attack and not a vulnerability? The vulnerability is most clearly input validation. The attack is script injection, of which cross-site scripting is a specific type. Do we agree? —Preceding unsigned comment added by 198.169.188.227 ( talk) 19:32, 5 September 2007 (UTC)
When you tag the page as "needs cleanup", "needs citations", "needs an expert", or whatever, please include a description here of the specific criticisms. I consider myself an expert on this topic and have written most of the content for the page. However, I'm a busy guy and only have time to look at the page once every few months. I certainly don't have time to read up on every Wikipedia policy regarding format, so please describe your gripes rather than just doing a hit-and-run tag like that. I can guess the citations issue could be resolved by adding inline external links or footnote tags. Certainly there are plenty (too many) of external resources listed at the bottom that could be better referenced internally to back up the page's assertions. However, the "needs an expert" tag confuses me. TDM ( talk) 19:55, 17 April 2008 (UTC)
I think this section is currently pretty weak. For one, the Python example can probably go away, since it isn't an ideal filter. Perhaps it would be better to start with a more abstract description of how to do white-list based character encoding (i.e., all characters except those in a white list get encoded), then move on to some algorithms or examples of libraries that already do this. Finally, I think it's important to include a note on defining a page's character set to prevent UTF-7 based attacks. There are very few good references online for this... mostly just specific vulnerabilities and associated exploits. I might get around to this rewrite at some point, but feel free to give it a go if anyone is interested. TDM ( talk) 18:01, 23 May 2008 (UTC)
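TDM's white-list based character encoding idea can be sketched roughly like this; this is a toy illustration under an assumed whitelist, not the ideal filter the article needs:

```python
# Assumed whitelist: alphanumerics and spaces pass through unchanged
ALLOWED = frozenset(
    'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 ')

def encode_untrusted(text):
    # Every character outside the whitelist becomes a numeric HTML
    # entity, so markup characters can never reach the parser raw
    return ''.join(c if c in ALLOWED else '&#%d;' % ord(c) for c in text)

print(encode_untrusted('<b>hi</b>'))  # &#60;b&#62;hi&#60;&#47;b&#62;
```

Because the default is to encode, characters the author never thought about (including multi-byte trickery) come out inert, which is the property TDM is after. The UTF-7 point still stands: the page must also declare its character set so the browser cannot be tricked into a different decoding.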
Hi. The "External links" section was tagged since last November so I removed it except for a couple. In case anyone needs them, here they all are. — SusanLesch ( talk) 17:46, 28 May 2008 (UTC)
From the Mitigation section, this was cut only because OpenAjax recommends iframe and I don't know how to reconcile the two thoughts. Maybe someone else would be able to restore this sentence if it needs to be there. Thanks. — SusanLesch ( talk) 06:10, 9 June 2008 (UTC)
Shouldn't specific tactics be listed? For example, a server which supports HTTP TRACE (or is accessed via such a proxy server [3]) is probably vulnerable to XST (cross-site tracing) [4]. -- Jesdisciple ( talk) 23:35, 1 October 2008 (UTC)
Due to the general requirement of the use of some social engineering in this case (and normally in Type 0 vulnerabilities as well), many programmers have disregarded these holes as not terribly important. This misconception is sometimes applied to XSS holes in general (even though this is only one type of XSS) and there is often disagreement in the security community as to the importance of cross-site scripting vulnerabilities.
This section misses the entire point. If I wanted to grab cookies from users using some form of local JavaScript exploit (just an example, there are hundreds of other things possible), I could use a type 1 vulnerability on a forum or popular website in order to attack the maximum number of users. The social engineering side of it is therefore a side-note, as it only applies when an attacker is targeting one specific user - and even then if they know they frequent that site they don't have to social engineer them at all.
Furthermore, what's with the names in the exploit scenarios? I'm all for equality but I'd like to see names in there that are at least somewhat universally pronounceable and single-barrelled. Try something like Anne instead of Lorenzo de Medici. 86.14.89.251 ( talk) 10:58, 5 October 2008 (UTC)
In the section "Eliminating scripts", our article says:
I just added a {{dubious}} tag because I frankly don't think it is true, and the citation given doesn't support the claim! The supporting cite is:
However, the quote in BBC article does not claim that 73% of websites rely on JavaScript, it says that "A further 73% failed to make the grade because of their reliance on JavaScript for some of the website's functionality." (My emphasis.) The difference is subtle but critical; if I say in an unqualified way that "mechanism A relies on B", it will be understood to mean that without B, A doesn't work at all. However if we say that "mechanism A relies on B for some functions", then clearly A still works without B.
I think that this weaker claim is perfectly plausible -- but also highly misleading, because in many cases, the lost functionality is inconsequential, or even annoying. I have been using NoScript for nearly three years now, and while it is probably true that only 27% of websites make no use of JavaScript whatsoever, it certainly is not true that the other 73% all require it in order to work. I am a net junky, but I have only 11 sites whitelisted (apart from the defaults); all the scores of other things I do on the net just don't need it. In fact, my subjective impression is that overwhelmingly the most common usage of JavaScript is for randomisation of ad loading; so on those sites, disabling scripts does nothing but speed up page loading and reduce clutter. The next most common usage would be form validation -- the absence of which you will generally not even notice (especially as all the most common types of data entry errors aren't detected by client side scripts anyway.) I don't know what fraction of websites critically depend on JavaScript in order to work at all (if I knew, I'd just edit the article), but it's nothing like 73%. -- Securiger ( talk) 05:36, 8 October 2008 (UTC)
Is there any scope in this article for a non-technical explanation? What does "injecting" scripts mean? Can it be explained in non-technical language? I would request a short section with a "non-technical explanation". (Or has this issue been dealt with somewhere? The need, or not, of providing non-technical explanations...) Devadaru ( talk) 16:33, 23 January 2009 (UTC)
The list of prominent domains hacked seems like it's promoting XSS hackers' glory. I don't think the New Zealand Herald, for example, or an Obama discussion forum, is nearly as prominent as Google or Yahoo. Maybe we could cull the list down to 4 or 5. —Preceding unsigned comment added by 69.71.99.90 ( talk) 16:51, 5 March 2009 (UTC)
Sometimes one would like to use XSS explicitly to let different apps communicate. HTML5 (IE8, FF3) supports window.postMessage, which is said to support cross-site messaging. I think there should be a reference to all that in the article. 200.75.30.10 ( talk) 14:06, 17 December 2009 (UTC)
In April 2010 an XSS vulnerability in JIRA was the stepping stone to the compromise of key Apache Software Foundation servers [1]. The method used in this XSS doesn't seem to match any of the ones in the article. -- Walter Görlitz ( talk) 22:36, 7 May 2010 (UTC)
The example at the end of the non-persistent section is actually a persistent attack. It is confusing. —Preceding unsigned comment added by 204.176.49.45 ( talk) 17:35, 5 May 2011 (UTC)
I see three major problems of the current article:
1. Although I am not a native English speaker, I am sure the author of the major part of this article is not a native English speaker either, and, more importantly, is not capable of producing a wiki-quality article in English. Examples: none needed; read through the article and you will see.
2. The technical terms used in this article are non-standardized and even not consistent within the article itself. Some examples:
- “Cross-site scripting holes are web-application vulnerabilities”, “In recent years, cross-site scripting flaws”
- “Besides content filtering”
What is “content filtering”? I actually understand what it means. But why not just use the terms that have been used in previous sections, such as “output encoding” or “input validation”? This article is full of this kind of inconsistent terminology.
3. The article has quite a lot of technical errors. Examples:
- The section about DOM based XSS is essentially wrong.
The author should read the OWASP link https://www.owasp.org/index.php/DOM_Based_XSS carefully and discuss it with a domain expert to make sure he really understands DOM-based XSS before writing the section.
- “Safely validating untrusted HTML input”
This section itself is right but misleading. ALL input data from users, the network, or any other untrusted source should be validated, not only the HTML input.
- “Some companies offer a periodic scan service, essentially simulating an attack from their server to a client's in order to check if the attack is successful”
Just one attack? No. A security scanning / assessment service generally tries all kinds of possible attack types and vectors, not just “an attack”.
- “There are several different escaping schemes … including HTML entity encoding, JavaScript escaping, CSS escaping, and URL (or percent) encoding”
This sentence is missing the word “etc.” at the end of the list. There are more encoding schemes for XSS prevention besides the four listed.
- “Most web applications that do not need to accept rich data can use escaping to largely eliminate the risk of XSS in a fairly straightforward manner.”
- “Encoding can be tricky”
The above two statements contradict each other. Is encoding (escaping) fairly straightforward, or is it tricky?
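Part of why escaping can look both straightforward and tricky is that the correct scheme depends on the output context. A hedged Python sketch of three of the schemes the quoted sentence lists (the payload is invented; the standard library calls shown are one possible implementation of each scheme):

```python
import html
import json
from urllib.parse import quote

payload = '";alert(1)//'

html_ctx = html.escape(payload, quote=True)  # HTML entity encoding
js_ctx = json.dumps(payload)                 # JavaScript string escaping
url_ctx = quote(payload, safe='')            # URL (percent) encoding

print(html_ctx)  # &quot;;alert(1)//
print(url_ctx)   # %22%3Balert%281%29%2F%2F
```

Each call is straightforward on its own; the tricky part is knowing which context the data lands in and applying the matching scheme, which is exactly the tension between the two quoted statements.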
- “When Bob reads the message, Mallory's XSS steals Bob's cookie.”
No. It is not Mallory’s XSS; it is the web site’s XSS vulnerability, and it is the script code injected by Mallory that steals Bob’s cookie. The wording here is too casual, too loose for a wiki article.
- “External links” section, the link “Simple XSS explanation”
Why is the link there? It points to a very poor-quality paper; it is far from a qualified reference for a wiki page.
- Etc.
In summary, I suggest a total rewrite of the whole article; the OWASP page https://www.owasp.org/index.php/XSS could be a good reference.
Condor Lee ( talk) 22:03, 5 June 2012 (UTC) Condor Lee
quote:
Some browsers or browser plugins can be configured to disable client-side scripts
I've never seen a browser that cannot disable JavaScript. If no one can name a browser that cannot disable scripts, I will change that statement. (I'm NOT talking about plugin/integrated browsers; I can mention one right now: Valve's Steam's integrated browser.) At least all mainstream browsers, including Opera, Firefox, Internet Explorer, and Safari, allow this. Divinity76 ( talk) 12:02, 27 June 2012 (UTC)
IE on Windows Phone 8 does not allow disabling Javascript. — Preceding unsigned comment added by 162.212.105.3 ( talk) 03:25, 16 October 2014 (UTC)
I wonder: do we confuse the term "XSS" with the term "XSS vulnerability" (or "XSS attacks", "XSS-based attacks", etc.) in this article?
Are "XSS" and "XSS vulnerability" really the same?
Michael V. Antosha (mivael) ( talk) 07:05, 11 October 2012 (UTC)
Should there be a section in Prevention about client-side filters such as XSS Auditor (Chrome) and NoScript (Firefox)? NoScript is mentioned for its ability to block all scripting, but it also includes a specific and advanced XSS filter, even if you enable JavaScript. XSS Auditor is less comprehensive (it has to be - it's targeting all Chrome users, instead of a security-minded subset of Firefox users), but it's significant. Carl.antuar ( talk) 06:15, 5 March 2014 (UTC)
Just an observation, but in view of the fact that pages cannot access cross-site cookies, I was quite surprised, and indeed alarmed, to see that the same does not apply to JavaScript. If script URLs could only be local files or URLs of the same root domain as the hosting page, that would nail all of the exploits bar the inline-scripting ones, which are limited in scope without a foreign script to call. The existing situation seems to be a classic example of feature bloat taking precedence over security. Though, I daresay a change now would break a lot of existing sites that rely on foreign scripts, for example jQuery being loaded from Google.
Mozilla browsers typically have an option to allow no cookies, local cookies or all cookies. Why not have the same for Javascript? -- Anteaus ( talk) 20:54, 28 May 2014 (UTC)
I think it's worth mentioning BBCode as a solution to prevent visitors from injecting HTML tags such as <script> into their comments. It is commonly used in (nearly all) modern forum software. — Preceding unsigned comment added by 96.11.182.98 ( talk) 15:49, 28 December 2015 (UTC)
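Worth noting that BBCode only helps if raw HTML is escaped first and the BBCode tags form a closed whitelist; otherwise it adds nothing. A minimal sketch of that order of operations (the function name and the two-tag whitelist are chosen arbitrarily for illustration):

```python
import html
import re

def render_bbcode(text):
    # First neutralise any raw HTML in the input...
    out = html.escape(text, quote=True)
    # ...then translate only a closed whitelist of BBCode tags
    out = re.sub(r'\[b\](.*?)\[/b\]', r'<b>\1</b>', out, flags=re.DOTALL)
    out = re.sub(r'\[i\](.*?)\[/i\]', r'<i>\1</i>', out, flags=re.DOTALL)
    return out

print(render_bbcode('[b]hi[/b] <script>alert(1)</script>'))
# <b>hi</b> &lt;script&gt;alert(1)&lt;/script&gt;
```

Escaping before translating is the load-bearing step: done in the other order, an attacker could smuggle markup through the substitutions.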
Perhaps there should be mention of so-called "semi-persistent" XSS, where the server negligently stores the payload in a cookie (or other client-side storage) that will later be used to assemble pages: http://research.zscaler.com/2009/05/cookie-based-persistent-xss.html Carl.antuar ( talk) 23:06, 24 January 2016 (UTC)
In some sections, "Mallory" is used as a woman's name and others it's used as a man's name. — Preceding unsigned comment added by Khatchad ( talk • contribs) 16:31, 2 March 2016 (UTC)
Several things could have been done to mitigate this attack:
The search input could have been sanitized, which would include proper encoding checks. The web server could have been set to redirect invalid requests.
There's no invalid request here.
Khatchad ( talk) 16:34, 2 March 2016 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Cross-site scripting. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
Added a {{dead link}} tag to http://www.modsecurity.org/projects/modsecurity/apache/feature_universal_pdf_xss.html

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 09:52, 9 December 2017 (UTC)
Hey User:Jmccormac,
I saw that you reverted my edit to a broken link and noted that it was a promotional link. It wasn't meant to be promotional. It was a recent blog post that I wrote covering the cross-site scripting bug. The link in Wikipedia was broken, but in addition, when using the Wayback Machine to check the broken link, the content was outdated and lacking important points. Important details such as the different forms of XSS, as well as the steps to mitigate the vulnerability, were missing. The blog post I replaced the broken link with covered those key points.
I did go back and review the blog post that I used to replace the broken link to figure out what might've made it 'promotional' -- It did have a form at the bottom to sign up for notifications of any future posts, which I removed this morning.
Please let me know if there are any other concerns about the article & how it might be considered promotional. If not, I'd appreciate the chance to have my replaced link stay on the article. Thanks! — Preceding unsigned comment added by X-Security-Austin ( talk • contribs)
![]() | This article is rated B-class on Wikipedia's
content assessment scale. It is of interest to the following WikiProjects: | ||||||||||||||||||||||||||||
|
![]() | This article is substantially duplicated by a piece in an external publication. Please do not flag this article as a copyright violation of the following source:
|
Is there any need for the tiny images of each of the three types of vulnerability on the NVD, the images are impossible to see without opening the full versions of them, and they add nothing to the article. I'm going to remove the images. If anyone disagrees they can revert and tell me why. Also I believe the exploit scenarios should be merged into the descriptions of each type of attack but I will not do that right now-- WikiSolved ( talk) 18:41, 25 June 2009 (UTC)
CLARIFICATION NEEDED ON Type-0 attack: In this section under the subsection Type-0 attack bullet #3 it says, "The malicious web page's JavaScript opens a vulnerable HTML page installed locally on Alice's computer." There is no explanation of how the vulnerable HTML page got installed locally on Alice's computer or how Mallory knew about it. This is the crux of this attack so without this this part of the explanation this scenario is not useful. I haven't found an answer to this or I would have corrected the article. I'm hoping someone else will read this who has more knowledge of this attach and add the clarifying information.
The first paragraph in section:"Other forms of mitigation" is garbage. Just quoting text will not stop it from being interpretted as html. I can always put "> into the text to close the tag. That whole section should be either removed or heavily modified. It is naive and innaccurate. -- 129.97.84.62 15:06, 4 April 2006 (UTC)
"Cross-site scripting (XSS) is a type of computer security vulnerability [...] XSS enables attackers to inject client-side script into Web pages". Is this correct in the first place??? I think the term Cross Site Scripting in itself only describes a page on one site containing a script hosted on another site (or is it having a script hosted on a site sending/loading data to/from another site? Not sure), which may or may not imply a vulnerability. I think what is described in this article is a particular use (or abuse) of XSS. Perhaps it should be calles "XSS Injection", or something like that?
Or am I wrong? Teo8976 ( talk) 21:24, 26 January 2015 (UTC)
There should be some mention of the two different approaches -- blacklisting (i.e., removing anything that can be recognised as a potential script injection) and whitelisting (i.e., only allowing stuff that can be determined not to be a potential script injection). If I had references for this kind of stuff, I'd add it myself, but I came here looking for them. :( JulesH 17:10, 27 July 2006 (UTC)
I like the recent example of PayPal's XSS hole. However, it isn't mentioned what type it is. Is it a type 1 XSS? If so, we can probably remove the ATutor example, since it isn't very well known, and replace it with the PayPal one. We should also keep the number of examples down to 4-5 if possible. It could easily grow to 1000 if everyone put their favorite in, but we don't need that. -- TDM 13:32, 8 August 2006 (UTC)
Someone added the notice recently that this page may not meet Wikipedia's standards due to the need for restructuring. Could whoever added that elaborate? I don't see any new comments here specific to that evaluation. If there's some other organization that would work better, I'd be willing to improve the document. TDM 23:57, 19 October 2006 (UTC)
I'd like to add a link to HTML Purifier in the Prevention section, as it implements the most reliable method: parsing and stripping all tags/attributes not in the whitelist (as well as other protection). Unfortunately, I wrote the library, so if I put it on it's vanity. So could someone take a look and, if it looks useful, add the link for me? Thanks! — Edward Z. Yang( Talk) 23:36, 29 November 2006 (UTC)
I'm not familiar enough with this article to know exactly where this should go, but I think this presentation of a Google Desktop vulnerability is extremely educational - they show how such small vulnerabilities in this case end up cascading into complete control over the victim's computer. The vulnerabilities they use are all patched (I think including one glitch that's server-side), so they no longer work, so it should be safe to show. This sounds like Type 2 in the article. — AySz88 \ ^-^ 20:39, 22 February 2007 (UTC)
I was almost certain we'd previously had a discussion on this, but obviously this is not the case. So, I'll bring it to the floor now.
I am strongly opposed to including the acronym "CSS" in the introduction paragraph of the article. It is misleading term that no one uses anymore, as the Terminology statement already states, and thus, while deserving mention in that segment, should not be in the intro. — Edward Z. Yang( Talk) 22:42, 28 February 2007 (UTC)
It's well-structured, concise, disambiguating, sufficiently detailed, and very clear. 64.221.248.17 22:24, 6 April 2007 (UTC) d following information is so good.
Disagree. The "Persistent" section is stupid. One, Mallory is a girl's name. Two, hackers do not watch football. Three, hackers would not use a dating website to find mates, if a hacker needed a mate they would use IRC or a relatively unknown underground chat program. Four, a hacker would use a girl's name for the purpose of elite deception. With this level of incongruity to reality, you might as well talk about Batman and Lex Luthor battling demons in space. — Preceding unsigned comment added by 69.1.35.199 ( talk) 15:47, 16 June 2011 (UTC)
called XSS. I don't know how to do disambiguation pages, and I'm not an expert on XSS (that's why I was looking them up), but maybe someone can help clarify this? All I know about XSS is that they sing sort of hip-hop style R&B in English, and that they're at least popular in the middle east.
Are XSS and the difficulty with interpreting and reformatting HTML some of the reasons why wikis don't use HTML for formatting? I know that one reason for not using HTML is that it might be difficult for some wiki users to learn. But it seems that the wiki formatting also helps prevent XSS while giving the users some control. -- Lance E Sloan 16:57, 8 August 2007 (UTC)
I have been getting strange XSS warnings in FF 2.0.0.6 from wikipedia articles with images lately. Does anyone know if there has been a change in the template formatting of images or if its a FF bug?
isn't cross site scripting really an attack and not a vulnerability? the vulnerability is most clearly input validation. the attack is script injection, of which cross site scripting is a a specific type of injection. do we agree? —Preceding unsigned comment added by 198.169.188.227 ( talk) 19:32, 5 September 2007 (UTC)
When you tag the page as "needs cleanup", "needs citations", "needs an expert", or whatever, please include a description here of the specific criticisms. I consider myself an expert on this topic and have wrote most of the content for the page. However, I'm a busy guy and only have time to look at the page once every few months. I certainly don't have time to read up on every Wikipedia policy regarding format, so please describe your gripes rather than just doing a hit-and-run tag like that. I can guess the citations issue could be resolved by adding inline external links or footnote tags. Certainly there are plenty (too many) of external resources listed at the bottom that could be better referenced internally to back up the page's assertions. However, the "needs an expert" tag confuses me. TDM ( talk) 19:55, 17 April 2008 (UTC)
I think this section is currently pretty weak. For one, the Python example can probably go away, since it isn't an ideal filter. Perhaps it would be better to start with a more abstract description of how to do white-list based character encoding (i.e., all characters except those in a white list get encoded), then move on to some algorithms or examples of libraries that already do this. Finally, I think it's important to include a note on defining a page's character set to prevent UTF-7 based attacks. There are very few good references online for this... mostly just specific vulnerabilities and associated exploits. I might get around to this rewrite at some point, but feel free to give it a go if anyone is interested. TDM ( talk) 18:01, 23 May 2008 (UTC)
Hi. The "External links" section was tagged since last November so I removed it except for a couple. In case anyone needs them, here they all are. — SusanLesch ( talk) 17:46, 28 May 2008 (UTC)
From the Mitigation section, this was cut only because OpenAjax recommends iframe and I don't know how to reconcile the two thoughts. Maybe someone else would be able to restore this sentence if it needs to be there. Thanks. — SusanLesch ( talk) 06:10, 9 June 2008 (UTC)
References
{{
cite web}}
: Check date values in: |date=
(
help)
Shouldn't specific tactics be listed? For example, a server which supports HTTP TRACE (or is accessed via such a proxy server [3]) is probably vulnerable to XST (cross-site tracing) [4]. -- Jesdisciple ( talk) 23:35, 1 October 2008 (UTC)
Due to the general requirement of the use of some social engineering in this case (and normally in Type 0 vulnerabilities as well), many programmers have disregarded these holes as not terribly important. This misconception is sometimes applied to XSS holes in general (even though this is only one type of XSS) and there is often disagreement in the security community as to the importance of cross-site scripting vulnerabilities.
This section misses the entire point. If I wanted to grab cookies from users using some form of local JavaScript exploit (just an example, there are hundreds of other things possible), I could use a type 1 vulnerability on a forum or popular website in order to attack the maximum number of users. The social engineering side of it is therefore a side note, as it only applies when an attacker is targeting one specific user - and even then, if they know the target frequents that site, they don't have to social engineer them at all.
Furthermore, what's with the names in the exploit scenarios? I'm all for equality but I'd like to see names in there that are at least somewhat universally pronounceable and single-barrelled. Try something like Anne instead of Lorenzo de Medici. 86.14.89.251 ( talk) 10:58, 5 October 2008 (UTC)
In the section "Eliminating scripts", our article says:
I just added a {{dubious}} tag because I frankly don't think it is true, and the citation given doesn't support the claim! The supporting cite is:
However, the quote in BBC article does not claim that 73% of websites rely on JavaScript, it says that "A further 73% failed to make the grade because of their reliance on JavaScript for some of the website's functionality." (My emphasis.) The difference is subtle but critical; if I say in an unqualified way that "mechanism A relies on B", it will be understood to mean that without B, A doesn't work at all. However if we say that "mechanism A relies on B for some functions", then clearly A still works without B.
I think that this weaker claim is perfectly plausible -- but also highly misleading, because in many cases, the lost functionality is inconsequential, or even annoying. I have been using NoScript for nearly three years now, and while it is probably true that only 27% of websites make no use of JavaScript whatsoever, it certainly is not true that the other 73% all require it in order to work. I am a net junky, but I have only 11 sites whitelisted (apart from the defaults); all the scores of other things I do on the net just don't need it. In fact, my subjective impression is that overwhelmingly the most common usage of JavaScript is for randomisation of ad loading; so on those sites, disabling scripts does nothing but speed up page loading and reduce clutter. The next most common usage would be form validation -- the absence of which you will generally not even notice (especially as all the most common types of data entry errors aren't detected by client side scripts anyway.) I don't know what fraction of websites critically depend on JavaScript in order to work at all (if I knew, I'd just edit the article), but it's nothing like 73%. -- Securiger ( talk) 05:36, 8 October 2008 (UTC)
Is there any scope in this article for a non-technical explanation? What does "injecting" scripts mean? Can it be explained in non-technical language? I would request a short section with a "non-technical explanation". (Or has this issue been dealt with somewhere? The need, or not, of providing non-technical explanations...) Devadaru ( talk) 16:33, 23 January 2009 (UTC)
The list of prominent domains hacked seems like it's promoting XSS hackers' glory. I don't think the New Zealand Herald, for example, or an Obama discussion forum, are nearly as prominent as Google or Yahoo. Maybe we could cull the list down to 4 or 5. —Preceding unsigned comment added by 69.71.99.90 ( talk) 16:51, 5 March 2009 (UTC)
Sometimes one would like to use XSS explicitly to communicate different apps. HTML5 (IE8, FF3) support window.postMessage, that is told to support cross-site messaging. I think there should be a reference to all that stuff in the article. 200.75.30.10 ( talk) 14:06, 17 December 2009 (UTC)
In April 2010 an XSS vulnerability in JIRA was the stepping stone to the compromise of key Apache Software Foundation servers [1]. The method used in this XSS doesn't seem to match any of the ones in the article. -- Walter Görlitz ( talk) 22:36, 7 May 2010 (UTC)
The example at the end of the non-persistent section is actually a persistent attack. It is confusing. —Preceding unsigned comment added by 204.176.49.45 ( talk) 17:35, 5 May 2011 (UTC)
I see three major problems of the current article:
1. Although I am not a native English speaker myself, I am sure the author of the major part of this article is not a native English speaker either. More importantly, he/she is not capable of creating a wiki-quality article in English. Examples: no single one needed; read through the article and you will see it everywhere.
2. The technical terms used in this article are non-standardized and even not consistent within the article itself. Some examples:
- “Cross-site scripting holes are web-application vulnerabilities”, “In recent years, cross-site scripting flaws”
- “Besides content filtering”
What is “content filtering”? I actually understand what it means. But why not just use the terms which have been used in previous sections, such as “output encoding” or “input validation”? This article is full of this kind of inconsistent terminology.
3. The article has quite a lot of technical errors. Examples:
- The section about DOM based XSS is essentially wrong.
The author should read the OWASP link https://www.owasp.org/index.php/DOM_Based_XSS carefully and discuss with domain expert to ensure he really understands DOM based XSS before creating the section.
- “Safely validating untrusted HTML input”
This section itself is right but misleading. ALL input data from user / network / untrusted sources should be validated, not only the HTML input.
- “Some companies offer a periodic scan service, essentially simulating an attack from their server to a client's in order to check if the attack is successful”
Just one attack? No. A security scanning / assessment service generally tries all kinds of attack types and vectors, not just “an attack”.
- “There are several different escaping schemes … including HTML entity encoding, JavaScript escaping, CSS escaping, and URL (or percent) encoding”
This sentence is missing the word “etc.” at the end of the list. There are more encoding schemes for XSS prevention besides the four listed.
- “Most web applications that do not need to accept rich data can use escaping to largely eliminate the risk of XSS in a fairly straightforward manner.”
- “Encoding can be tricky”
The above two statements are self-contradictory. Is encoding (escaping) fairly straightforward, or tricky?
- “When Bob reads the message, Mallory's XSS steals Bob's cookie.”
No. It is not Mallory’s XSS; it is the web site’s XSS vulnerability, and it is the script code injected by Mallory which steals Bob’s cookie. The wording here is too casual, too loose for a wiki article.
- “External links” section, the link “Simple XSS explanation”
Why is the link there? It points to a very poor quality paper; it is far from a qualified reference for a wiki page.
- Etc.
In summary, I suggest a total rewriting of the whole article – the OWASP page
https://www.owasp.org/index.php/XSS could be a good reference.
Condor Lee ( talk) 22:03, 5 June 2012 (UTC) Condor Lee
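On the point about multiple escaping schemes: a short Python sketch can make the article's list concrete, showing that the same untrusted string needs a different encoding in each output context (the variable names are mine, not the article's):

```python
import html
import json
from urllib.parse import quote

untrusted = '"><script>alert(1)</script>'

# HTML body context: entity-encode the markup-significant characters.
html_ctx = html.escape(untrusted)

# URL query context: percent-encode everything outside the unreserved set.
url_ctx = quote(untrusted, safe="")

# JavaScript string context: JSON-encode, which escapes the quotes.
# Note: json.dumps alone does not escape '<', so embedding the result
# inside an inline <script> block needs additional escaping of '<' or '/'.
js_ctx = json.dumps(untrusted)
```

Picking the wrong scheme for the context (e.g. HTML entity encoding inside a JavaScript string) is exactly why encoding is "tricky" even though each individual scheme is straightforward.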
quote:
Some browsers or browser plugins can be configured to disable client-side scripts
I've never seen a browser that cannot disable JavaScript. If no one can name a browser that cannot disable scripts, I will change that statement. (I'm NOT talking about plugin/integrated browsers; I can mention one right now: Valve's Steam's integrated browser.) At least all mainstream browsers, including Opera, Firefox, Internet Explorer, and Safari, allow this. Divinity76 ( talk) 12:02, 27 June 2012 (UTC)
IE on Windows Phone 8 does not allow disabling Javascript. — Preceding unsigned comment added by 162.212.105.3 ( talk) 03:25, 16 October 2014 (UTC)
I wonder do we confuse "XSS" term with "XSS vulnerability" term (or "XSS attacks", or "XSS based attacks", etc.) in this article?
Are "XSS" and "XSS vulnerability" really the same?
Michael V. Antosha (mivael) ( talk) 07:05, 11 October 2012 (UTC)
Should there be a section in Prevention about client-side filters such as XSS Auditor (Chrome) and NoScript (Firefox)? NoScript is mentioned for its ability to block all scripting, but it also includes a specific and advanced XSS filter, even if you enable JavaScript. XSS Auditor is less comprehensive (it has to be - it's targeting all Chrome users, instead of a security-minded subset of Firefox users), but it's significant. Carl.antuar ( talk) 06:15, 5 March 2014 (UTC)
Just an observation, but in view of the fact that pages cannot access cross-site cookies, I was quite surprised, and indeed alarmed, to see that the same does not apply to JavaScript. If script URLs could only be local files or URLs of the same root domain as the hosting page, that would nail all of the exploits bar the inline scripting ones, which are limited in scope without a foreign script to call. The existing situation seems to be a classic example of feature bloat taking precedence over security. Though, I daresay a change now would break a lot of existing sites that rely on foreign scripts, for example jQuery being loaded from Google.
Mozilla browsers typically have an option to allow no cookies, local cookies or all cookies. Why not have the same for Javascript? -- Anteaus ( talk) 20:54, 28 May 2014 (UTC)
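For what it is worth, the same-origin restriction described above is roughly what the Content-Security-Policy response header now provides: a server can declare which origins scripts may be loaded from, and a conforming browser refuses both injected inline scripts and scripts from any other host. A minimal sketch (the whitelisted jQuery CDN host is illustrative):

```
Content-Security-Policy: script-src 'self' https://ajax.googleapis.com
```

This keeps the choice with the site operator rather than the browser user, which sidesteps the compatibility problem of breaking existing sites wholesale.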
I think it's worth mentioning BBCode as a solution to avoid visitors injecting HTML tags such as <script> into their comments, something commonly used in (nearly all) modern forum software. — Preceding unsigned comment added by 96.11.182.98 ( talk) 15:49, 28 December 2015 (UTC)
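The safety of the BBCode approach depends on the order of operations: escape all HTML first, then translate a small white list of BBCode tags back into markup. A minimal Python sketch of that idea (the function name and supported tags are illustrative):

```python
import html
import re

def bbcode_to_html(text):
    """Escape ALL HTML first, then translate a small white list of
    BBCode tags into markup. Because escaping happens first, a
    visitor-supplied <script> tag can never reach the page as HTML."""
    safe = html.escape(text)
    safe = re.sub(r"\[b\](.*?)\[/b\]", r"<strong>\1</strong>", safe)
    safe = re.sub(r"\[i\](.*?)\[/i\]", r"<em>\1</em>", safe)
    return safe
```

Tags like [url] need more care (the URL itself must be validated), so this is only safe for tags whose replacement contains no attacker-controlled attribute values.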
Perhaps there should be mention of so-called "semi-persistent" XSS, where the server negligently stores the payload in a cookie (or other client-side storage) that will later be used to assemble pages: http://research.zscaler.com/2009/05/cookie-based-persistent-xss.html Carl.antuar ( talk) 23:06, 24 January 2016 (UTC)
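The "semi-persistent" case is worth illustrating: server code that reflects a stored cookie into a page without escaping is just as injectable as a query parameter, because cookie values are client-writable. A framework-free Python sketch (names are mine):

```python
import html

def render_greeting(cookies):
    """Build a greeting from a client-controlled cookie value.
    The cookie is attacker-writable (e.g. planted by an earlier
    reflected XSS or a subdomain), so it must be escaped exactly
    like any other untrusted input before being placed in HTML."""
    name = cookies.get("display_name", "guest")
    return "<p>Welcome back, {}!</p>".format(html.escape(name))
```

Without the html.escape call, a poisoned cookie would re-inject the payload on every later page view, which is what makes the attack "semi-persistent".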
In some sections, "Mallory" is used as a woman's name and others it's used as a man's name. — Preceding unsigned comment added by Khatchad ( talk • contribs) 16:31, 2 March 2016 (UTC)
Several things could have been done to mitigate this attack:
The search input could have been sanitized which would include proper encoding checking. The web server could be set to redirect invalid requests.
There's no invalid request here.
Khatchad ( talk) 16:34, 2 March 2016 (UTC)
Hello fellow Wikipedians,
I have just modified one external link on Cross-site scripting. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
Added {{dead link}} tag to http://www.modsecurity.org/projects/modsecurity/apache/feature_universal_pdf_xss.html
When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
Cheers.— InternetArchiveBot ( Report bug) 09:52, 9 December 2017 (UTC)
Hey User:Jmccormac,
I saw that you reverted my edit to a broken link and noted that it was a promotional link. It wasn't meant to be promotional. It was a recent blog post that I wrote covering the cross-site scripting bug. The link in Wikipedia was broken; in addition, when using the Wayback Machine to check the broken link, the content was outdated and lacking important points. Important details, such as the different forms of XSS as well as the steps to mitigate the vulnerability, were missing. The blog post I replaced the broken link with covered those key points.
I did go back and review the blog post that I used to replace the broken link to figure out what might've made it 'promotional' -- It did have a form at the bottom to sign up for notifications of any future posts, which I removed this morning.
Please let me know if there are any other concerns about the article & how it might be considered promotional. If not, I'd appreciate the chance to have my replaced link stay on the article. Thanks! — Preceding unsigned comment added by X-Security-Austin ( talk • contribs)