Patrick,
Just a heads up: a URL containing "://" more than once can still be valid.
For example, open "http://www.google.com://" in a browser and it loads
fine, and if you read the RFC you'll find it's allowed there too. A better
solution is to validate against a standard URL pattern rather than counting
occurrences. Here's your one-liner :):
# given your "url": add a validation error when it doesn't match a
# standard URL pattern (scheme, optional user:pass@, host, optional port, path)
if url.match(/^(ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?$/).nil?
  errors.add(:url, "invalid url...")
end
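And if all you want is the substring count, String#scan already gives you
a one-liner: scan returns an array of matches, so its size is the number
of occurrences. A sketch (note that a single duplicate paste already
yields two "://"s, so the threshold to check is 1 rather than 2):
# flag the url when "://" shows up more than once
if url.scan(/:\/\//).size > 1
  errors.add(:url, "is too long. Bad cut and paste?")
end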
-Jordan
On Aug 23, 2006, at 10:52 PM, Patrick Crowley wrote:
I need to test for URL submissions that contain duplicate protocols
-- which usually means a user pasted a link into a text field too
many times.
In PHP, I'd use the substr_count function for this... but I can't
find a good Ruby replacement.
Here's the test code I've written so far:
# fake url
url = 'http://site.com/link.ics http://site.com/link.ics http://site.com/link.ics'
# count the number of times '://' appears
count = 0
url.scan(/:\/\//) do |match|
  count += 1
end
# is this a bad url?
if count > 2
  puts 'this is a bad url'
end
This works, but doesn't seem like the Ruby way. Is there a better
path?
....
Ultimately, I'm aiming for some crazy one-line Ruby magic like this...
if url.some.ruby.method./(:\/\/)/ > 2
errors.add(:url, "is too long. Bad cut and paste?")
end
Best,
Patrick
_______________________________________________
Sdruby mailing list
[email protected]
http://lists.sdruby.com/mailman/listinfo/sdruby