From Rosetta Code
You are encouraged to solve this task according to the task description, using any language you may know.

Print an HTTPS URL's content to the console. Checking the host certificate for validity is recommended. The client should not authenticate itself to the server — the webpage supports that access policy — as that is the subject of other tasks.

Readers may wish to contrast with the HTTP Request task, and also the task on HTTPS request with authentication.
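Across all of the entries below, the recipe is the same: open a TLS connection, let the library verify the server certificate, read the response body, and print it. As a language-neutral sketch of that recipe, here is how it might look with Python's standard library (the `fetch` helper and the example URL are placeholders, not part of any entry):

```python
import ssl
from urllib.request import urlopen

def fetch(url: str) -> str:
    # The default SSL context checks the server certificate against
    # the system CA store and verifies the hostname.
    ctx = ssl.create_default_context()
    with urlopen(url, context=ctx) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset)

# Usage (hypothetical URL):
# print(fetch("https://example.com/"))
```

Passing an explicit `ssl.create_default_context()` just makes the certificate-checking step visible; recent Python versions verify certificates by default.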


Ada

Library: AWS

Exactly the same as the HTTP task, assuming you compiled AWS with openssl support.

with AWS.Client;
with AWS.Response;
with Ada.Text_IO; use Ada.Text_IO;

procedure GetHttps is
begin
   Put_Line (AWS.Response.Message_Body (AWS.Client.Get (URL => "")));
end GetHttps;

AutoHotkey

Library: wininet
URL := ""
msgbox % html := UrlGetContents(URL)
#include urlgetcontents.ahk
#include wininet.ahk

Batch File

:: Must have curl.exe
curl.exe -k -s -L

C

Library: libcurl
#include <stdio.h>
#include <stdlib.h>
#include <curl/curl.h>

int main(void)
{
    CURL *curl;
    char buffer[CURL_ERROR_SIZE];

    if ((curl = curl_easy_init()) != NULL) {
        curl_easy_setopt(curl, CURLOPT_URL, "");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, buffer);
        if (curl_easy_perform(curl) != CURLE_OK) {
            fprintf(stderr, "%s\n", buffer);
        }
        curl_easy_cleanup(curl);
    }
    return EXIT_SUCCESS;
}

C#

Works with: C sharp version 3.0
using System;
using System.Net;

class Program
{
    static void Main(string[] args)
    {
        var client = new WebClient();
        var data = client.DownloadString("");
        Console.WriteLine(data);
    }
}

This does not work for URLs requiring a secure (SSL) connection.

Clojure

Using the duck-streams as a convenient wrapper for Java's networking classes, grabbing the contents of an HTTPS URL is as easy as:

(use '[clojure.contrib.duck-streams :only (slurp*)])
(print (slurp* ""))

The usual Java mechanisms can be used to manage acceptance of SSL certificates if required.

Works with: Clojure version 1.2
(print (slurp ""))

Common Lisp

Library: DRAKMA

First grabbing the entire body as a string, and then by pulling from a stream. This is the same code as in HTTP Request; drakma:http-request supports SSL.

(defun wget-drakma-string (url &optional (out *standard-output*))
"Grab the body as a string, and write it to out."
(write-string (drakma:http-request url) out))
(defun wget-drakma-stream (url &optional (out *standard-output*))
"Grab the body as a stream, and write it to out."
(loop with body = (drakma:http-request url :want-stream t)
for line = (read-line body nil nil)
while line do (write-line line out)
finally (close body)))
;; Use
(wget-drakma-stream "")

Delphi

Library: OpenSSL
program ShowHTTPS;

uses IdHttp, IdSSLOpenSSL;

var
  s: string;
  lHTTP: TIdHTTP;
begin
  lHTTP := TIdHTTP.Create(nil);
  lHTTP.IOHandler := TIdSSLIOHandlerSocketOpenSSL.Create(lHTTP);
  lHTTP.HandleRedirects := True;
  s := lHTTP.Get('');
  Writeln(s);
end.

Erlang

Synchronous

main([Url|[]]) ->
    case http:request(get, {Url, []}, [{ssl,[{verify,0}]}], []) of
        {ok, {_V, _H, Body}} -> io:fwrite("~p~n", [Body]);
        {error, Res} -> io:fwrite("~p~n", [Res])
    end.

Asynchronous

main([Url|[]]) ->
    http:request(get, {Url, []}, [{ssl,[{verify,0}]}], [{sync, false}]),
    receive
        {http, {_ReqId, Res}} -> io:fwrite("~p~n", [Res]);
        _Any -> io:fwrite("Error: ~p~n", [_Any])
    after 10000 -> io:fwrite("Timed out.~n", [])
    end.

Using it:

escript ./req.erl

F#

The underlying .NET classes handle secure web connections the same way they manage insecure connections.

let wget (url : string) =
    let c = new System.Net.WebClient()
    c.DownloadString(url)

Frink


Go

package main

import (
	"io"
	"net/http"
	"os"
)

func main() {
	r, err := http.Get("")
	if err != nil {
		panic(err)
	}
	io.Copy(os.Stdout, r.Body)
}

Groovy

new URL("").eachLine { println it }

Haskell

Library: http-conduit
Works with: GHC version 7.4.1

This is just the example from Network.HTTP.Conduit, with the http URL replaced with an https one, since http-conduit natively supports https without needing any additional work.

import Network.HTTP.Conduit
import qualified Data.ByteString.Lazy as L
import Network (withSocketsDo)
main = withSocketsDo
$ simpleHttp "" >>= L.putStr

Ioke

Translation of: Java
connection = URL new("") openConnection
scanner = Scanner new(connection getInputStream)

while(scanner hasNext,
  scanner next println
)

J

Using gethttp from Web Scraping

#page=: gethttp''
#page=: '--no-check-certificate' gethttp''

(We cannot load the example page over https unless we disable certificate checking. The numbers are the number of characters retrieved.)

Java

Additional certificate information is available through the javax.net.ssl.HttpsURLConnection API.

URL url = new URL("");
HttpsURLConnection connection = (HttpsURLConnection) url.openConnection();
Scanner scanner = new Scanner(connection.getInputStream());

while (scanner.hasNext()) {
    System.out.println(scanner.next());
}

JavaScript

(function (url, callback) { // on some browsers you can check certificate information
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onload = function () { callback(xhr.responseText); };
    xhr.send();
})('', function (text) { console.log(text); });

Lasso

local(x = curl(''))
local(y = #x->result)

If a site with an invalid SSL Cert is encountered the curl type throws the following error:

FAILURE: 60 Peer certificate cannot be authenticated with given CA certificates

LiveCode

Blocking version
libURLSetSSLVerification true  --check cert
get URL ""

Non-blocking version, execute getWebResource

on myUrlDownloadFinished
get URL "" -- this will now fetch a locally cached copy
put it
end myUrlDownloadFinished
command getWebResource
libURLFollowHttpRedirects true
libURLSetSSLVerification true --check cert
load URL "" with message "myUrlDownloadFinished"
end getWebResource

LSL

Virtually identical to the HTTP Task.

To test it yourself, rez a box on the ground, and add the following as a New Script.

string sURL = "https://SourceForge.Net/";
key kHttpRequestId;

default {
    state_entry() {
        kHttpRequestId = llHTTPRequest(sURL, [], "");
    }
    http_response(key kRequestId, integer iStatus, list lMetaData, string sBody) {
        if (kRequestId == kHttpRequestId) {
            integer x = 0;
            for (x = 0; x < llGetListLength(lMetaData); x++) {
                llOwnerSay("llList2String(lMetaData, " + (string)x + ")=" + llList2String(lMetaData, x));
            }
            list lBody = llParseString2List(sBody, ["\n"], []);
            for (x = 0; x < llGetListLength(lBody); x++) {
                llOwnerSay("llList2String(lBody, " + (string)x + ")=" + llList2String(lBody, x));
            }
        }
    }
}


Sample output:

llList2String(lMetaData, 0)=0
llList2String(lMetaData, 1)=2048
llList2String(lBody, 0)=<!doctype html>
llList2String(lBody, 1)=<!-- Server: sfs-consume-7 -->
llList2String(lBody, 2)=<!--[if lt IE 7 ]> <html lang="en" class="no-js ie6" > <![endif]-->
llList2String(lBody, 3)=<!--[if IE 7 ]>    <html lang="en" class="no-js ie7" > <![endif]-->
llList2String(lBody, 4)=<!--[if IE 8 ]>    <html lang="en" class="no-js ie8" > <![endif]-->
llList2String(lBody, 5)=<!--[if IE 9 ]>    <html lang="en" class="no-js ie9" > <![endif]-->
llList2String(lBody, 6)=<!--[if (gt IE 9)|!(IE)]>--> <html lang="en" class="no-js"> <!--<![endif]-->
llList2String(lBody, 7)=    <head>
llList2String(lBody, 8)=        <meta charset="utf-8">
llList2String(lBody, 9)=        
llList2String(lBody, 10)=        <meta id="webtracker" name="webtracker" content='{"event_id": "ea71f064-ca28-11e1-98cc-0019b9f0e8fc"}'>
llList2String(lBody, 11)=        <meta name="description" content="Free, secure and fast downloads from the largest Open Source applications and software directory -">
llList2String(lBody, 12)=        <meta name="keywords" content="Open Source, Open Source Software, Development, Community, Source Code, Secure,  Downloads, Free Software">
llList2String(lBody, 13)=<meta name="msvalidate.01" content="0279349BB9CF7ACA882F86F29C50D3EA" />
llList2String(lBody, 14)=        <meta name="viewport" content="width=device-width, initial-scale=1.0">
llList2String(lBody, 15)=        <title>SourceForge - Download, Develop and Publish Free Open Source Software</title>
llList2String(lBody, 16)=        <link rel="shortcut icon" href="">
...   ...   ...   ...   ...   ...   ...   ...   ...   ...   ...   ...   ...   ...

Maple

content := URL:-Get( "" );

Mathematica

A straightforward "Import" task. More complicated secure web access can be done using J/Link, essentially a link to the Java API.

content=Import["", "HTML"]

MATLAB / Octave


Nemerle

This example is essentially identical to the HTTP task because the WebClient object can be used with http:, https:, ftp: and file: URIs.

using System;
using System.Console;
using System.Net;
using System.IO;

module HTTP
{
    Main() : void
    {
        def wc = WebClient();
        def myStream = wc.OpenRead("");
        def sr = StreamReader(myStream);
        WriteLine(sr.ReadToEnd());
    }
}

Nim

import httpclient
echo getContent ""

Objeck

use HTTP;
use Collection;

class HttpsTest {
    function : Main(args : String[]) ~ Nil {
        client := HttpsClient->New();
        lines := client->Get("");
        each(i : lines) {
            lines->Get(i)->As(String)->PrintLine();
        };
    }
}

Perl

Library: LWP
use strict;
use LWP::UserAgent;
my $url = '';
my $response = LWP::UserAgent->new->get( $url );
$response->is_success or die "Failed to GET '$url': ", $response->status_line;
print $response->as_string;

PHP

echo file_get_contents('');

PicoLisp

PicoLisp has no functionality for communicating with an HTTPS server (only for the other direction), but it is easy to use an external tool:

(in '(curl "") # Open a pipe to 'curl'
(out NIL (echo)) ) # Echo to standard output

Pike

int main() {
    write("%s\n", Protocols.HTTP.get_url_data(""));
    return 0;
}

PowerShell

$wc = New-Object Net.WebClient
$wc.DownloadString('')

If the certificate cannot be validated (untrusted, self-signed, expired), an exception is thrown with the message “The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.”, so certificate validation is performed automatically by the method.

Python

Python's urllib.request library (urllib2 in Python 2.x) supports SSL if the interpreter's underlying httplib libraries were compiled with SSL support. By default this is enabled in standard Python installations on most platforms.

Python 3.x:

from urllib.request import urlopen
print(urlopen('').read())

Python 2.x:

from urllib2 import urlopen
print urlopen('').read()

R

Library: RCurl
Library: XML

The basic idea is to use getURL (as with HTTP_Request), but with some extra parameters.

webpage <- getURL("", .opts=list(followlocation=TRUE, ssl.verifyhost=FALSE, ssl.verifypeer=FALSE))

In this case, the webpage output contains unprocessed characters, e.g. \" instead of " and \\ instead of \, so we need to process the markup.

wp <- readLines(tc <- textConnection(webpage))

Finally, we parse the HTML and find the interesting bit.

pagetree <- htmlTreeParse(wp)

Racket

#lang racket
(require net/url)
(copy-port (get-pure-port (string->url "")
                          #:redirections 100)
           (current-output-port))

REALbasic

REALbasic provides an HTTPSecureSocket class for handling HTTPS connections. The Get method of HTTPSecureSocket is overloaded and can either download data to a file or return data as a string; in both cases an optional timeout argument can be passed.

Dim sock As New HTTPSecureSocket
Print(sock.Get("", 10)) //set the timeout period to 10 seconds.

RLaB


Ruby

This solution doesn't use the open-uri convenience package that the HTTP Request#Ruby solution uses: the Net::HTTP object must be told to use SSL before the session is started.

require 'net/https'
require 'uri'
require 'pp'
uri = URI.parse('')
http =,uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE
http.start do
  content = http.get("/")
  p [content.code, content.message]
  pp content.to_hash
  puts content.body
end


Sample output (abridged):

["302", "Found"]
 "content-type"=>["text/html; charset=UTF-8"],
 "date"=>["Sun, 30 Aug 2009 20:20:07 GMT"],
  ["sf.consume=89f65c6fadd222338b2f3de6f8e8a17b2c8f67c2gAJ9cQEoVQhfZXhwaXJlc3ECY2RhdGV0aW1lCmRhdGV0aW1lCnEDVQoH9gETAw4HAAAAhVJxBFUDX2lkcQVVIDEyOWI2MmVkOWMwMWYxYWZiYzE5Y2JhYzcwZDMxYTE4cQZVDl9hY2Nlc3NlZF90aW1lcQdHQdKmt73UN21VDl9jcmVhdGlvbl90aW1lcQhHQdKmt73UN2V1Lg==; expires=Tue, 19-Jan-2038 03:14:07 GMT; Path=/"]}
  <title>302 Found</title>
  <h1>302 Found</h1>
  The resource was found at <a href=""></a>;
you should be redirected automatically.


Scala

Library: Scala
object HttpsTest extends App {
  System.setProperty("http.agent", "*")
  scala.io.Source.fromURL("").getLines.foreach(println)
}

Seed7

The library gethttps.s7i defines the function getHttps, which uses the HTTPS protocol to get a file.

$ include "seed7_05.s7i";
include "gethttps.s7i";
include "utf8.s7i";
const proc: main is func
  begin
    writeln(STD_UTF8_OUT, getHttps(""));
  end func;

Sidef

var lwp = require('LWP::UserAgent');    # LWP::Protocol::https is needed
var url = '';
var ua = lwp.new(
    agent    => 'Mozilla/5.0',
    ssl_opts => Hash(verify_hostname => 1),
);
var resp = ua.get(url);
resp.is_success || die "Failed to GET #{url.dump}: #{resp.status_line}\n";
print resp.decoded_content;

Swift

import Foundation
// With https
let request = NSURLRequest(URL: NSURL(string: "")!)

NSURLConnection.sendAsynchronousRequest(request, queue: NSOperationQueue()) {res, data, err in // callback
    // data is binary
    if data != nil {
        let string = NSString(data: data!, encoding: NSUTF8StringEncoding)
        println(string!)
    }
}

CFRunLoopRun() // dispatch

Tcl

Though Tcl's built-in http package does not understand SSL, it does support the registration of external handlers to accommodate additional protocols. This allows the use of the Tls package to supply the missing functionality with only a single line to complete the registration.

package require http
package require tls
# Tell the http package what to do with “https:” URLs.
# First argument is the protocol name, second the default port, and
# third the connection builder command
http::register "https" 443 ::tls::socket
# Make a secure connection, which is almost identical to normal
# connections except for the different protocol in the URL.
set token [http::geturl ""]
# Now as for conventional use of the “http” package
puts [http::data $token]
http::cleanup $token



UNIX Shell

curl -k -s -L

VBScript

Based on code at How to retrieve HTML web pages with VBScript via the Microsoft.XmlHttp object

Option Explicit

Const sURL = ""

Dim oHTTP
Set oHTTP = CreateObject("Microsoft.XmlHTTP")
On Error Resume Next
oHTTP.Open "GET", sURL, False
oHTTP.Send ""
If Err.Number = 0 Then
    WScript.Echo oHTTP.responseText
Else
    WScript.Echo "error " & Err.Number & ": " & Err.Description
End If
Set oHTTP = Nothing

Visual Basic .NET

Imports System.Net

Dim client As WebClient = New WebClient()
Dim content As String = client.DownloadString("")
Console.WriteLine(content)

zkl

Using the cURL library to do the heavy lifting:

zkl: var ZC=Import("zklCurl")
zkl: var data=ZC().get("")

get returns the text of the response along with two counts: the number of bytes of header preceding the HTML and the number of bytes of trailing data after the end of the page. So, if you wanted to look at the header:

zkl: data[0][0,data[1]).text
HTTP/1.1 200 OK
Server: nginx
Date: Sun, 23 Mar 2014 07:36:51 GMT
Content-Type: text/html; charset=utf-8
Connection: close

or some of the html:

zkl: data[0][data[1],200).text
<!doctype html>
<!-- Server: sfs-consume-8 -->

<!--[if lt IE 7 ]> <html lang="en" class="no-js ie6"> <![endif]-->
<!--[if IE 7 ]>    <html lang="en" class="no-js ie7"> <![endif]-->
<!--[if IE 8 ]>   